Re: Crash in postgres/linux on very large database

From: "Gregory S(dot) Williamson" <gsw(at)globexplorer(dot)com>
To: "Bernhard Ankenbrand" <b(dot)ankenbrand(at)media-one(dot)de>, <pgsql-general(at)postgresql(dot)org>
Subject: Re: Crash in postgres/linux on very large database
Date: 2004-04-06 23:36:11
Message-ID: 71E37EF6B7DCC1499CEA0316A2568328DC9B95@loki.wc.globexplorer.net
Lists: pgsql-general

Different file system, but we've loaded 39,000,000-row tables (about 30 gigs of spatial data) without any issues ... CPU load was 1.something for a while, but no problems with loading, indexing or doing the stats command. I might suspect the underlying OS ...
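
A minimal sketch of the kind of index build and stats run being described, assuming a hypothetical table and column name (the thread doesn't show any actual DDL). In 7.4 the sort memory used by CREATE INDEX comes from sort_mem; maintenance_work_mem only appeared in 8.0:

    -- hypothetical names; sort_mem is in KB and applies to this session only
    SET sort_mem = 65536;                                 -- roughly 64 MB for the index sort
    CREATE INDEX idx_bigtable_col ON bigtable (some_column);
    ANALYZE bigtable;                                     -- "the stats command" mentioned above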

Greg Williamson
DBA
GlobeXplorer LLC

-----Original Message-----
From: Bernhard Ankenbrand [mailto:b.ankenbrand@media-one.de]
Sent: Tuesday, April 06, 2004 4:22 AM
To: pgsql-general@postgresql.org
Subject: [GENERAL] Crash in postgres/linux on very large database

Hi,

we have a table with about 60,000,000 entries and about 4 GB storage size.
When creating an index on this table, the whole Linux box freezes and the
ReiserFS file system is corrupted and not recoverable.

Does anybody have experience with this amount of data in postgres 7.4.2?
Is there a limit anywhere?

Thanks

Bernhard Ankenbrand

