## How does Numeric division determine precision?

From: Will Pugh
To: pgsql-sql(at)postgresql(dot)org
Subject: How does Numeric division determine precision?
Date: 2012-07-12 16:02:03
Message-ID: CAM39vH_0snUv_YRCU11L2iB4zj=Dz7F9B_RhUPtYDqdSKymaDw@mail.gmail.com
```Hi,

I'm using Postgres 9.1, and wanted to understand how some of the
numeric operations work.

It seems that in 9.1, numerics that don't have a specified precision
and scale have arbitrary scale/precision.

For many operations this is straightforward.  However, when doing a
division operation that does not terminate, I'm curious about how the
number of digits is determined.

It seems like there is some minimum precision, e.g.
>select 1/3::numeric
0.33333333333333333333

However, when operating on numbers with larger precision:
>select .5353535353355353535353/74::numeric
0.0072345072342639912640

.5353535353355353535353 has 22 digits
.0072345072342639912640 also has 22 digits, but should the first two
0's after the decimal point count as "precision"?

If I then, do the same operation, but move the decimal point on the
divisor, I get a different amount of precision:
>select .5353535353355353535353/.0074::numeric
72.3450723426399126399054

.5353535353355353535353 still has 22 digits
72.3450723426399126399054 now has 24 digits

For the most part, this seems correct, but I'm interested in knowing
how you determine precision and scale for the result of a divide.  Is
there a well known algorithm?

Thanks,
--Will

```
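Tom Lane's reply in this thread points at the answer; for reference, the quotient scale is chosen by `select_div_scale()` in `src/backend/utils/adt/numeric.c`. The sketch below is a simplified Python re-implementation of that logic, not the actual backend code; the constants and the function's structure are taken from my reading of the PostgreSQL source and should be checked against `numeric.c` for your version. It reproduces the scales seen in the three examples above.

```python
from decimal import Decimal

# Constants as in PostgreSQL's numeric.c (9.1-era values; treat as assumptions)
DEC_DIGITS = 4                  # numeric stores base-10000 "digits"
NUMERIC_MIN_SIG_DIGITS = 16     # minimum significant digits wanted in a quotient
NUMERIC_MIN_DISPLAY_SCALE = 0
NUMERIC_MAX_DISPLAY_SCALE = 1000

def _weight_and_first_digit(v: Decimal):
    """Normalized base-10000 weight w (10000**w <= |v| < 10000**(w+1))
    and the leading base-10000 digit of v."""
    if v == 0:
        return 0, 0
    a = abs(v)
    w = 0
    while a >= 10000:
        a /= Decimal(10000)
        w += 1
    while a < 1:
        a *= Decimal(10000)
        w -= 1
    return w, int(a)

def _dscale(v: Decimal) -> int:
    """Number of digits after the decimal point, as written."""
    return max(0, -v.as_tuple().exponent)

def select_div_scale(var1: Decimal, var2: Decimal) -> int:
    """Sketch of how the result scale of var1 / var2 is chosen."""
    w1, fd1 = _weight_and_first_digit(var1)
    w2, fd2 = _weight_and_first_digit(var2)
    # Estimate the weight of the quotient; when the leading digits are
    # equal we can't be sure, so pessimistically assume var1 < var2.
    qweight = w1 - w2
    if fd1 <= fd2:
        qweight -= 1
    # Ask for at least 16 significant digits, but never fewer fractional
    # digits than either input already carries.
    rscale = NUMERIC_MIN_SIG_DIGITS - qweight * DEC_DIGITS
    rscale = max(rscale, _dscale(var1), _dscale(var2), NUMERIC_MIN_DISPLAY_SCALE)
    return min(rscale, NUMERIC_MAX_DISPLAY_SCALE)

# The three examples from the mail:
print(select_div_scale(Decimal("1"), Decimal("3")))                              # 20
print(select_div_scale(Decimal("0.5353535353355353535353"), Decimal("74")))      # 22
print(select_div_scale(Decimal("0.5353535353355353535353"), Decimal("0.0074")))  # 22
```

This explains the observations in the mail: `1/3` gets 20 fractional digits because the estimated quotient weight is one base-10000 digit below zero (16 + 4 = 20), while both division examples with the 22-digit operand get scale 22 because the result scale is never allowed to drop below either input's display scale. Leading zeros after the decimal point count toward the scale, not toward significant digits, which is why the scale (not the significant-digit count) stays at 22 in both cases.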

### pgsql-sql by date

- Next: From: Tom Lane, Date: 2012-07-12 17:30:19, Subject: Re: How does Numeric division determine precision?
- Previous: From: David Johnston, Date: 2012-07-12 13:40:12, Subject: Re: Prevent double entries ... no simple unique index