Thank you for your response!
I tested the split command, but I ran into another problem:
each piece of the file doesn't have the "COPY" header.
As I said earlier, one table is more than 30 GB and
I need to import it into another server (Linux), but by DVD
media, because there is no network connection.
What's the best way to do this?
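Assuming the dump is a plain-text SQL file, a minimal sketch of splitting it into DVD-sized pieces and rejoining them on the target Linux server could look like this (the names "dump.sql", "dump.part.", and "mydb" are illustrative, not from the thread):

```shell
# Split a plain-text dump into ~4 GB pieces that fit on a DVD.
split --bytes=4000m dump.sql dump.part.

# On the target server, copy the pieces off the DVDs, then
# reassemble them in order and restore:
cat dump.part.* > dump.sql
psql -U postgres -d mydb -f dump.sql
```

The suffixes split generates (aa, ab, ac, ...) sort alphabetically, so `cat dump.part.*` restores the original byte order without extra bookkeeping.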
If there is no other way, how can I put the
"header" at the beginning of the file without opening it?
With the "cat >>" command I can only append to the end.
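Since ">>" only appends, one workaround is to write a new file with cat, header first, and then rename it over the piece. A sketch, assuming the COPY line was saved once to "header.txt" and a split piece is named "chunk.aa" (both names are assumptions):

```shell
# cat cannot prepend in place, so concatenate into a new file
# (header first, then the data piece) and rename it back.
cat header.txt chunk.aa > fixed.aa
mv fixed.aa chunk.aa
```

This does need enough free disk space for one extra copy of the piece while the new file is being written.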
Could you help me?
Once again, thank you!
--- Scott Marlowe <scott(dot)marlowe(at)gmail(dot)com> wrote:
> On Dec 17, 2007 2:06 PM, A.Burbello
> <burbello3000(at)yahoo(dot)com(dot)br> wrote:
> > I consider this a good way to transport
> > files, because I have a table that is more than
> > 30GB.
> > But if the OS were Windows, I couldn't split it,
> > because postgres doesn't have this feature!
> > It would be a good option if postgres had it natively.
> > Well, I will try this:
> > eg: $ pg_dump postgres -U postgres |
> > split --bytes=10m - split.txt.
> for now. I used these back in the day (NT4.0 SP4 or
> so) and they worked a charm back then. Heck, even ln
> worked (in a manner of speaking) back then.