From: Shalu Gupta <sgupta5(at)unity(dot)ncsu(dot)edu>
To: pgsql-general(at)postgresql(dot)org
Cc: pgsql-hackers(at)postgresql(dot)org
Subject: TPC H data
Date: 2004-04-21 04:08:16
Message-ID: Pine.GSO.4.58.0404210005190.29636@uni00du.unity.ncsu.edu
Lists: pgsql-general pgsql-hackers
Hello,
We are trying to import the TPC-H data into PostgreSQL using the COPY
command, and for the larger files the load fails with an
insufficient-memory error.

We are using a Linux system with PostgreSQL 7.3.4.

Is it that PostgreSQL cannot handle such large files, or is there some
other possible reason?
Thanks
Shalu Gupta
NC State University.