From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: Peter Eisentraut <peter_e(at)gmx(dot)net>
Cc: Casey Allen Shobe <cshobe(at)secureworks(dot)net>, pgsql-bugs(at)postgreSQL(dot)org
Subject: Re: Postgres storing time in strange manner
Date: 2002-09-17 06:01:32
Message-ID: 21086.1032242492@sss.pgh.pa.us
Lists: pgsql-bugs pgsql-novice
Peter Eisentraut <peter_e(at)gmx(dot)net> writes:
> If the test doesn't use any library function's run-time behavior, you can
> usually do something like
> main() {
>     int a[(2.0+2.0==4.0) ? 1 : -1];
> }
> This will fail to compile if the floating-point arithmetic is broken.
However, unless gcc itself is compiled with -ffast-math, such an
approach won't expose the bug.
I had success with this test:
#include <stdio.h>
double d18000 = 18000.0;
int main(void) {
    int d = d18000 / 3600;    /* float-to-int conversion truncates toward zero */
printf("18000.0 / 3600 = %d\n", d);
return 0;
}
Using Red Hat 7.2's compiler:
[tgl(at)rh1 tgl]$ gcc -v
Reading specs from /usr/lib/gcc-lib/i386-redhat-linux/2.96/specs
gcc version 2.96 20000731 (Red Hat Linux 7.1 2.96-98)
I get:
[tgl(at)rh1 tgl]$ gcc bug.c
[tgl(at)rh1 tgl]$ ./a.out
18000.0 / 3600 = 5 -- right
[tgl(at)rh1 tgl]$ gcc -ffast-math bug.c
[tgl(at)rh1 tgl]$ ./a.out
18000.0 / 3600 = 4 -- wrong!
You need the dummy global variable to keep the compiler from simplifying
the division at compile time, else you get 5. With the test as
exhibited, the -O level seems not to matter.
regards, tom lane
| From | Date | Subject
---|---|---|---
Next Message | Tom Lane | 2002-09-17 06:07:31 | Re: Postgres storing time in strange manner
Previous Message | Tom Lane | 2002-09-17 05:15:30 | Re: [BUGS] Postgres storing time in strange manner