On lör, 2011-06-04 at 07:09 +0000, dinesh wrote:
> I have a table which is used during data uploads, a so-called staging table.
> This table has a fixed number of columns that [must] match the input CSV
> file. This CSV file is uploaded using COPY command. Following the COPY, a
> new column (meant for indexing) is constructed on this table using some
> application logic; and dropped after that data upload cycle is over.
> After some 1500+ cycles, I get the following error:
> ERROR: tables can have at most 1600 columns
> SQL state: 54011
> Context: SQL statement "ALTER TABLE stage_fo ADD COLUMN exch_ticker char
> So it appears that the command
> ALTER TABLE stage_fo DROP COLUMN exch_ticker
> is only producing some soft effects, not sufficient for the db engine.
Yeah, DROP COLUMN only hides the column and gradually phases its data
out of storage; it does not remove the column's entry from the system
catalogs, and it is those catalog entries that count against the 1600
limit. So that application design is not going to work.
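You can see this for yourself (a sketch; assumes you can query the
system catalogs) by looking at pg_attribute, where each dropped column
lingers with attisdropped = true:

    SELECT attname, attisdropped
    FROM pg_attribute
    WHERE attrelid = 'stage_fo'::regclass;

Every such entry still occupies an attribute number, and attribute
numbers are what the 1600 limit caps.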
I find it a bit suspicious that you add a column "for indexing", when
you could perhaps be using an expression index.
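For example (a sketch; the column names in the expression are
hypothetical, since your application logic isn't shown), instead of
an ADD COLUMN / populate / index / DROP COLUMN sequence each cycle,
a single

    CREATE INDEX stage_fo_exch_ticker_idx
    ON stage_fo ((symbol || ':' || series));

indexes the computed value directly and never alters the table
definition, so the column limit is never consumed.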