One table or many tables for data set

From: "Castle, Lindsay" <lindsay(dot)castle(at)eds(dot)com>
To: pgsql-performance(at)postgresql(dot)org
Subject: One table or many tables for data set
Date: 2003-07-23 00:34:41
Message-ID: B09017B65BC1A54BB0B76202F63DDCCA0532489F@auntm201
Lists: pgsql-performance

Hi all,

I'm working on a project that has a data set of approximately 6 million
rows with about 12,000 different elements; each element has 7 columns of
data.

I'm wondering which of these layouts would be faster from a scanning
perspective (SELECT statements with some calculations):
	one table for all the data
	one table for each data element (12,000 tables)
	one table per subset of elements (e.g. all elements that start
	with "a" in one table)

The data is static once it's in the database; only new records are added on
a regular basis.

I'd like to run quite a few differently formulated scans in the longer term,
so having efficient scans is a high priority.

Can I do anything with indexing to help with performance?  I suspect that
for the majority of scans I will need to evaluate an outcome based on 4 or 5
of the 7 columns of data.
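
If most scans end up filtering on the same few columns, I imagine a
multicolumn index on those columns might help (again using the placeholder
names from the sketch above -- this is only a sketch, not a tested layout):

CREATE INDEX element_data_scan_idx
    ON element_data (element_name, val1, val2, val3);

-- a scan that restricts the leading index column(s) can then use the index;
-- the element name and the formula here are made up for illustration
SELECT element_name, val1 * val2 - val3 AS outcome
FROM element_data
WHERE element_name = 'aardvark'
  AND val1 > 0;

I realise whether the planner actually uses such an index depends on the
selectivity of the predicates, so running EXPLAIN ANALYZE on representative
queries would be the real test.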

Thanks in advance :-)

Linz
