I'm using Postgres 9.1, and wanted to understand how some of the
numeric operations work.
It seems that in 9.1, numerics that don't have a specified precision
and scale are treated as arbitrary precision/scale.
For many operations this is straightforward. However, when doing a
division operation that does not terminate, I'm curious about how the
number of digits is determined.
It seems like there is some minimum precision, e.g.
However, when operating on numbers with larger precision:
.5353535353355353535353 has 22 digits
.0072345072342639912640 also has 22 digits, but should the first two
0's after the decimal point count as "precision"?
If I then do the same operation, but move the decimal point in the
divisor, I get a different amount of precision:
.5353535353355353535353 still has 22 digits
72.3450723426399126399054 now has 24 digits
For the most part, this seems correct, but I'm interested in knowing
how the precision and scale of a division result are determined. Is
there a well-known algorithm?
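For what it's worth, the result scale appears to be chosen by select_div_scale() in src/backend/utils/adt/numeric.c. Here is a rough Python sketch of that logic as I read it; the constants and the exact handling of the first digit group are from my reading of the source and may differ between versions. Note that numeric stores digits in base-10000 groups (4 decimal digits per group), which would explain why the result scale sometimes jumps by more than one digit when the inputs shift:

```python
# Sketch (not the actual implementation) of select_div_scale() from
# PostgreSQL's src/backend/utils/adt/numeric.c. Constant values below
# are assumptions based on the 9.x-era source.
NUMERIC_MIN_SIG_DIGITS = 16      # minimum significant digits desired
NUMERIC_MIN_DISPLAY_SCALE = 0    # never fewer than 0 fractional digits
NUMERIC_MAX_DISPLAY_SCALE = 1000 # hard cap on fractional digits
DEC_DIGITS = 4                   # decimal digits per base-10000 group

def select_div_scale(weight1, firstdigit1, dscale1,
                     weight2, firstdigit2, dscale2):
    """Choose the result scale (digits after the decimal point) for
    var1 / var2.  weightN is the base-10000 weight of the first
    nonzero digit group, firstdigitN is that group's value (1..9999),
    and dscaleN is the input's display scale."""
    # Estimate the weight of the quotient's first digit group.
    qweight = weight1 - weight2
    if firstdigit1 <= firstdigit2:
        qweight -= 1
    # Ask for at least 16 significant digits, counted in 4-digit
    # groups, but never less than either input's own scale.
    rscale = NUMERIC_MIN_SIG_DIGITS - qweight * DEC_DIGITS
    rscale = max(rscale, dscale1, dscale2, NUMERIC_MIN_DISPLAY_SCALE)
    rscale = min(rscale, NUMERIC_MAX_DISPLAY_SCALE)
    return rscale
```

Under this sketch, 1::numeric / 3 (both inputs with scale 0, quotient below 1) gets qweight = -1 and rscale = 16 + 4 = 20, matching the 20 threes I see from select 1::numeric / 3. The "minimum precision" would then be roughly 16 significant digits, rounded up to a 4-digit group boundary, which also fits the 22- and 24-digit results above.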