full table...

From: Felson <felson123(at)yahoo(dot)com>
To: pgsql-novice(at)postgresql(dot)org
Subject: full table...
Date: 2002-08-19 23:56:53
Message-ID: 20020819235653.12874.qmail@web13003.mail.yahoo.com
Lists: pgsql-novice

I have a table that stores a HUGE volume of data every
day. I am now running into a problem where, when I try
to insert data, the remote connection times out
because it takes too long... (1 minute)

The solution I was going to try is to break up the
table into one per upload site (tablename_siteid as
the table name). This could possibly result in there
being 1000 or more tables in about a year's time. Can
this cause me any problems? If so, what would be a
better answer?

current structure:

       Table "channeldata"
 Attribute  |   Type    |       Modifier
------------+-----------+--------------------------
 id         | integer   | not null default nextval
 cd_id      | integer   | default 0
 s_id       | integer   | default 0
 units      | smallint  | default 0
 datareal   | float8    |
 dataalt    | float8    |
 dataoffset | float8    | default 0
 tstamp     | timestamp | default now()

proposed structure:

   Table "channeldata_[s_id]"
 Attribute  |   Type    |       Modifier
------------+-----------+--------------------------
 id         | integer   | not null default nextval
 cd_id      | integer   | default 0
 units      | smallint  | default 0
 datareal   | float8    |
 dataalt    | float8    |
 dataoffset | float8    | default 0
 tstamp     | timestamp | default now()
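For illustration, a minimal sketch of how the proposed per-site table names and their DDL might be generated. The sequence name `channeldata_id_seq` is an assumption (the original listing truncates the nextval argument), and the site IDs used are hypothetical:

```python
# Sketch: build per-site table names and CREATE TABLE statements
# for the proposed channeldata_[s_id] layout.

def site_table_name(s_id: int) -> str:
    """Per-site table name, e.g. channeldata_42 (s_id is hypothetical)."""
    return f"channeldata_{s_id}"

def create_table_sql(s_id: int) -> str:
    """DDL for one per-site table, mirroring the proposed structure.
    The sequence name below is an assumption, not from the original post."""
    return (
        f"CREATE TABLE {site_table_name(s_id)} (\n"
        "    id         integer   NOT NULL DEFAULT nextval('channeldata_id_seq'),\n"
        "    cd_id      integer   DEFAULT 0,\n"
        "    units      smallint  DEFAULT 0,\n"
        "    datareal   float8,\n"
        "    dataalt    float8,\n"
        "    dataoffset float8    DEFAULT 0,\n"
        "    tstamp     timestamp DEFAULT now()\n"
        ");"
    )

print(create_table_sql(42))
```

With ~1000 sites, this script would emit ~1000 such statements, which is the scale the question is asking about.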

