The Dgesv.dgesv routine from JLAPACK 0.6 does not
return consistently correct results under IBM's Linux
JRE, identified by:
$ java -version
java version "1.3.0"
Java(TM) 2 Runtime Environment, Standard Edition (build
Classic VM (build 1.3.0, J2RE 1.3.0 IBM build
cx130-20010626 (JIT enabled: jitc))
This appears to be a bad interaction with the JIT
compiler: the routine returns a correct answer the
first time it is called but eventually begins
returning incorrect answers. I have appended a small
test program, written in Jython, that demonstrates the
problem. The program solves a 3x3 matrix equation
10001 times using Dgesv.dgesv and prints the results of
the first and last attempts. The answers should be
identical, but they are not; only the first result is
correct.
$ jython jlapack_test.py
Result first attempt: 0
v = -0.08293070351379606
v = 0.09266920225961406
v = -0.001722128614625683
Result 10000th attempt: 0
v = 22.200877727061638
v = 1749.617616670161
v = 1.302740822326132
For this project, turning off the JIT is not an
option, nor is switching to another JRE. Our ability
to use JLAPACK therefore depends on finding a
workaround for this bug.
Let me know if there is any more information I can
provide to help diagnose this problem.