
From: Heikki Linnakangas <hlinnaka(at)iki(dot)fi>
To: Andrey Borodin <amborodin(at)acm(dot)org>, pgsql-hackers(at)postgresql(dot)org
Subject: Re: Optimizing numeric SUM() aggregate
Date: 2016-09-02 08:52:49
Lists: pgsql-hackers

On 08/03/2016 08:47 PM, Andrey Borodin wrote:
> 1. Currently overflow is carried every 9999 additions. I suggest that
> it is possible to do it only every (INT32_MAX - INT32_MAX / NBASE) /
> (NBASE - 1) additions. Please note that with this overflow count it
> will be necessary to check whether two finishing carries are
> required.

I left it at (NBASE - 1), but added a comment on that. I couldn't measure
any performance difference between that and a larger interval, so it
didn't seem worthwhile. The code wouldn't be much more complicated, but
the math required to explain what the safe maximum is would be :-).

> 2. Aggregates and numeric regression tests do not contain
> any test case covering overflows. I recommend adding a sum of
> 1 000 000 values of 99 999 999 each. Maybe you will come up with
> more clever overflow testing.

Hmm. Added a couple of simple SUM() queries to the regression tests, so
that we have at least some test coverage of the overflows. But nothing
very clever.

Looking closer, I don't think this implementation is actually dependent
on NBASE being 10000. So I removed the comment changes that suggested so.

I did some more minor comment fixes, and committed. Thanks for the review!

- Heikki
