Test failures on i386 in NormalDistributionTest - Redmine #1986
Archive from user: Nicholas Breen
Two tests in the basic ‘make check’ suite fail when building the Debian packages from the 2016-beta2 tarball on the i386 (32-bit) architecture. These failures are from the single-precision, non-MPI build; they abort the build, so I haven’t yet checked whether the double-precision or MPI variants show the same issue. The tests fail on three different build machines running three different kernels (Linux, FreeBSD, and Hurd), so a flaky machine is unlikely. From the displayed output, though, it’s not immediately obvious to me what is actually failing.
[----------] 4 tests from NormalDistributionTest
[ RUN ] NormalDistributionTest.Output
[ OK ] NormalDistributionTest.Output (0 ms)
[ RUN ] NormalDistributionTest.Logical
[ OK ] NormalDistributionTest.Logical (0 ms)
[ RUN ] NormalDistributionTest.Reset
/«PKGBUILDDIR»/src/gromacs/random/tests/normaldistribution.cpp:104: Failure
Value of: valB
Actual: -1.19571
Expected: valA
Which is: -1.19571
[ FAILED ] NormalDistributionTest.Reset (0 ms)
[ RUN ] NormalDistributionTest.AltParam
/«PKGBUILDDIR»/src/gromacs/random/tests/normaldistribution.cpp:120: Failure
Value of: distB(rngB, paramA)
Actual: -1.19571
Expected: distA(rngA)
Which is: -1.19571
[ FAILED ] NormalDistributionTest.AltParam (0 ms)
[----------] 4 tests from NormalDistributionTest (0 ms total)
Build logs:
https://buildd.debian.org/status/fetch.php?pkg=gromacs&arch=i386&ver=2016%7Ebeta2-1&stamp=1465406125
https://buildd.debian.org/status/fetch.php?pkg=gromacs&arch=hurd-i386&ver=2016%7Ebeta2-1&stamp=1465411500
https://buildd.debian.org/status/fetch.php?pkg=gromacs&arch=kfreebsd-i386&ver=2016%7Ebeta2-1&stamp=1465406606
(from redmine: issue id 1986, created on 2016-06-09 by gmxdefault, closed on 2016-07-08)
- Changesets:
- Revision fa1d62c6 by Erik Lindahl on 2016-07-08T09:08:31Z:
Work around compiler issue with random test
gcc-4.8.4 running on 32-bit Linux fails a few
tests for random distributions. This seems
to be caused by the compiler doing something
strange (that can lead to differences in the lsb)
when we do not use the result as floating-point
values, but rather do exact binary comparisons.
This is valid C++, and bad behaviour of the
compiler (IMHO), but technically it is not required
to produce bitwise identical results at high
optimization. However, by using floating-point
tests with zero ULP tolerance the problem
appears to go away.
Fixes #1986.
Change-Id: I252f37b46605424c02435af0fbf7a4f81b493eb8