Re: Inconsistent behavior on Array & Is Null?

From: Greg Stark <gsstark(at)mit(dot)edu>
To: Joe Conway <mail(at)joeconway(dot)com>
Cc: Greg Stark <gsstark(at)mit(dot)edu>, pgsql-hackers(at)postgresql(dot)org
Subject: Re: Inconsistent behavior on Array & Is Null?
Date: 2004-04-03 00:02:11
Message-ID: 87n05uj5do.fsf@stark.xeocode.com
Lists: pgsql-hackers


Joe Conway <mail(at)joeconway(dot)com> writes:

> Greg Stark wrote:
> > array_lower() and array_upper() are returning NULL for a non-null input, the
> > empty array, even though lower and upper bounds are known just as well as they
> > are for any other sized array. They are behaving as if there's something
> > unknown about the empty array that makes it hard to provide a lower bound or
> > upper bound.
>
> Sorry, but I still disagree. There *is* something unknown about the lower and
> upper bound of an empty array because there are no bounds.

So I understand your point of view now. But I think the intuitive meaning here, the lower/upper bound as the lowest/highest index where an element is present, is only part of the picture.

The lower and upper bounds are also tied to other properties, like where array_prepend and array_append place new elements, and of course the array length.
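
To make that concrete (my own illustration, not something from the thread, assuming the default 1-based bounds):

    -- a three-element array with the default bounds
    select array_lower(array[1,2,3], 1), array_upper(array[1,2,3], 1);   -- 1 and 3
    -- appending places the new element just past the old upper bound
    select array_upper(array_append(array[1,2,3], 4), 1);                -- 4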

So to give a practical example, say I were implementing a stack using an array. I push things on by extending the array, storing into arr[array_upper(arr)+1]. (Incidentally, I don't think we actually have any way to shorten an array, do we?) As soon as I pop off the last element I lose my stack pointer, and I need a special case in my code to handle pushing elements when the array is empty.
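
A minimal sketch of the push, assuming a hypothetical table stacks(id int, arr int[]) whose arr column starts out as '{}':

    -- push: store just past the current upper bound; the coalesce() is the
    -- special case needed only because array_upper('{}'::int[], 1) is NULL
    update stacks
       set arr[coalesce(array_upper(arr, 1), 0) + 1] = 42
     where id = 1;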

In reality array_append() would work fine; it's only array_upper() that goes out of its way to make things weird. There's still an upper bound, and array_append() knows it, but array_upper() hides the value from the user.
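
That is (my illustration of the behavior being complained about):

    select array_append('{}'::int[], 7);    -- {7}: append clearly knew where to put it
    select array_upper('{}'::int[], 1);     -- NULL: the same bound, hidden from the user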

> I don't see the spec defined CARDINALITY as a workaround. It defines length as
> the number of elements in the array. When the array is empty, that value is
> clearly 0. Nothing strange about it.

The workaround is having to handle that case with a special-case if statement. If array_lower(), array_upper(), and array_length()/CARDINALITY were all defined consistently, there shouldn't have to be any special cases in the implementations. There should be a simple, rigid mathematical relationship between them, namely upper - lower + 1 = length.
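
Spelled out (my arithmetic, for both the one-based default and the zero-based convention):

    array[1,2]:        lower = 1, upper =  2, length =  2 - 1 + 1 = 2
    '{}', one-based:   lower = 1, upper =  0, length =  0 - 1 + 1 = 0
    '{}', zero-based:  lower = 0, upper = -1, length = -1 - 0 + 1 = 0

Either choice of bounds for the empty array preserves the identity; only returning NULL breaks it.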

> > test=# select array_upper(a||b, 1), array_upper(a,1)+array_length(b) from (select '{}'::int[] as a, array[1,2] as b) as x;
> >  array_upper | ?column?
> > -------------+----------
> >            2 |
> > (1 row)
>
> OK, you got me with this corner case. But using what you described as the
> result int_aggregate would give you in this case (-1), you would get an even
> stranger answer (-1 + 2 = 1) that would still need to be worked around.

No, actually 1 would be the correct answer: the original array would have indexes ranging over [0,-1] and the new array would have indexes ranging over [0,1], i.e., two elements. The only strangeness is the unusual lower bound, which isn't the default for postgres arrays constructed from string literals. Personally I prefer zero-based indexes, but since neither SQL-foo nor backwards compatibility agrees with me here I'll give that up as a lost cause :)
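
In other words (my arithmetic for the example above):

    a = '{}' with bounds [0,-1]:   array_upper(a, 1) = -1
    b = array[1,2], two elements:  array_upper(a, 1) + array_length(b) = -1 + 2 = 1
    a || b has bounds [0,1]:       array_upper(a||b, 1) = 1, so both expressions agree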

--
greg
