Re: postmaster segfaults with HUGE table

From: Simon Riggs <simon(at)2ndquadrant(dot)com>
To: pgsql-hackers(at)postgresql(dot)org
Cc: Neil Conway <neilc(at)samurai(dot)com>
Subject: Re: postmaster segfaults with HUGE table
Date: 2004-11-14 11:24:34
Message-ID: 1100431474.2950.166.camel@localhost.localdomain
Lists: pgsql-hackers

On Sun, 2004-11-14 at 10:05, Neil Conway wrote:
> Joachim Wieland wrote:
> > this query makes postmaster (beta4) die with signal 11:
> >
> > (echo "CREATE TABLE footest(";
> > for i in `seq 0 66000`; do
> > echo "col$i int NOT NULL,";
> > done;
> > echo "PRIMARY KEY(col0));") | psql test
> >
> >
> > ERROR: tables can have at most 1600 columns
> > LOG: server process (PID 2140) was terminated by signal 11
> > LOG: terminating any other active server processes
> > LOG: all server processes terminated; reinitializing
>
> At best you're going to get the error message above: "tables can have at
> most 1600 columns". But this is definitely a bug: we end up triggering
> this assertion:
>
> TRAP: BadArgument("!(attributeNumber >= 1)", File: "tupdesc.c", Line: 405)
>
> This specific assertion is triggered because we represent attribute
> numbers throughout the code base as a (signed) int16 -- the assertion
> failure has occurred because an int16 has wrapped around due to
> overflow. A fix would be to add a check to DefineRelation() (or even
> earlier) to reject CREATE TABLEs with more than "MaxHeapAttributeNumber"
> columns. We eventually do perform this check in
> heap_create_with_catalog(), but by that point it is too late: various
> functions have been invoked that assume we're dealing with a sane number
> of columns.
>
> BTW, I noticed that there's an O(n^2) algorithm to check for duplicate
> column names in DefineRelation() -- with 60,000 column names that takes
> minutes to execute, but an inconsequential amount of time with 1500
> column names. So I don't think there's any point rewriting the algorithm
> to be faster -- provided we move the check for MaxHeapAttributeNumber
> previous to this loop, we should be fine.
>
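
For anyone wondering why the crash shows up as that particular
assertion: attribute numbers are AttrNumber, a typedef for a signed
int16, so once CREATE TABLE processing counts past 32767 columns the
value wraps negative. A minimal standalone illustration of that
wraparound, assuming the usual two's-complement conversion (this is
not backend code):

#include <stdio.h>
#include <stdint.h>

int
main(void)
{
	/* AttrNumber in the backend is a signed 16-bit integer */
	int16_t		attno;

	attno = (int16_t) 1600;		/* MaxHeapAttributeNumber: fine */
	printf("column 1600  -> %d\n", attno);

	attno = (int16_t) 32768;	/* wraps to -32768 on two's-complement */
	printf("column 32768 -> %d\n", attno);

	return 0;
}

The wrapped value is what eventually fails the !(attributeNumber >= 1)
check in tupdesc.c.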

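As a rough sketch of the sort of early check being suggested -- placed
at the top of DefineRelation(), before any per-column work such as the
O(n^2) duplicate-name loop -- something along these lines might do
(only an outline; the exact errcode and field names would need
checking against the tree):

	if (list_length(stmt->tableElts) > MaxHeapAttributeNumber)
		ereport(ERROR,
				(errcode(ERRCODE_TOO_MANY_COLUMNS),
				 errmsg("tables can have at most %d columns",
						MaxHeapAttributeNumber)));

Inherited columns wouldn't be counted by a check at that point, but
presumably the existing test in heap_create_with_catalog() would still
catch those cases later.
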
This seems too obvious a problem to have gone unnoticed... presumably
it has been there for a while? Does this mean that we do not have
regression tests for each maximum setting, i.e. are we missing a whole
class of tests from the regression suite?

--
Best Regards, Simon Riggs
