I am in the process of creating a database design in which LOTS of data need
to be modelled.
For instance, I need to store data about products. Every product has LOTS of
properties, well over a hundred.
So I'm wondering: what's the best approach here, performance-wise? Just
create one Product table with well over a hundred columns? Or would it be
better to divide this over more tables and link them together via IDs? I
could, for instance, create tables Product, PriceInfo, Logistics,
StorageInfo, PackagingInfo and link them together via the same ID. This
would be easier to document (try to visualize a 100+ column table in a
document!), but would it impact performance? I think it might hurt SELECT
performance, but updating products would maybe speed up a little...
All info about a product is unique for this product so records in PriceInfo,
Logistics, StorageInfo, PackagingInfo tables would map one to one to records
in the Product table.
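To make the "split" design concrete, here is a minimal sketch of what I mean, using Python's sqlite3 module as a stand-in for PostgreSQL. The table and column names (list_price, shelf_life_days, etc.) are illustrative assumptions, not a real schema; each side table shares Product's primary key, so the mapping is one-to-one:

```python
import sqlite3

# Hypothetical sketch: the wide Product record is divided over several
# tables that share the same primary key, so each side table maps
# one-to-one to a Product row.  Column names are made up for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Product (
    product_id INTEGER PRIMARY KEY,
    name       TEXT NOT NULL
);
CREATE TABLE PriceInfo (
    product_id INTEGER PRIMARY KEY REFERENCES Product(product_id),
    list_price REAL
);
CREATE TABLE StorageInfo (
    product_id INTEGER PRIMARY KEY REFERENCES Product(product_id),
    shelf_life_days INTEGER
);
""")
conn.execute("INSERT INTO Product VALUES (1, 'Widget')")
conn.execute("INSERT INTO PriceInfo VALUES (1, 9.95)")
conn.execute("INSERT INTO StorageInfo VALUES (1, 365)")

# A SELECT over the full record now needs joins, while an UPDATE that
# touches only price data rewrites just the narrow PriceInfo row.
row = conn.execute("""
    SELECT p.name, pr.list_price, s.shelf_life_days
    FROM Product p
    JOIN PriceInfo   pr USING (product_id)
    JOIN StorageInfo s  USING (product_id)
    WHERE p.product_id = 1
""").fetchone()
print(row)  # ('Widget', 9.95, 365)
```

This illustrates the trade-off in the question: reading the whole product costs extra joins, while updates confined to one group of columns touch a smaller row.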
Do any of you know if and how PostgreSQL would prefer one approach over the
other?
Thanks in advance,