JDK-6345154 : JVM dumps core on Linux/x86 in app that uses NVidia Cg code
  • Type: Bug
  • Component: hotspot
  • Sub-Component: runtime
  • Affected Version: 5.0
  • Priority: P3
  • Status: Closed
  • Resolution: Not an Issue
  • OS: linux_redhat_3.0
  • CPU: x86
  • Submitted: 2005-11-02
  • Updated: 2010-08-19
  • Resolved: 2005-11-22
Description
The attached tarball contains the class files and libraries for a test case that uses NVidia's Cg library from Java. When run on 32-bit Linux with Java 1.5.0_01, the test case fails with a SEGV that appears to come from an error in the JVM (see the comments for more detail).

To run the test case, start with a machine running 32-bit RHEL3 with an NVidia graphics card and recent NVidia graphics drivers. Unpack the tarball, set JAVA_HOME, cd into 'gpu', and run ./runTest.
The NVidia driver needs to be version 7664, available at:

http://www.nvidia.com/object/linux_display_ia32_1.0-7664.html

Newer drivers expose a separate problem with the test case that prevents reproducing this bug.

Also, the Cg libraries need to be installed from:

http://download.nvidia.com/developer/cg/Cg_1.4/Cg-1.4.0-4.i386.rpm

Comments
EVALUATION After some investigation it appears that NVidia's Cg library is changing the signal handlers out from under the JVM, even though the error logs suggest that it is not. Setting the environment variable LD_PRELOAD to /full_path_to_jdk/jre/lib/i386/libjsig.so resolves the crashes. This is not a bug in the JVM but well-defined behavior when interacting with libraries that use signal handlers; see e.g. http://java.sun.com/j2se/1.4.2/docs/guide/vm/signal-chaining.html .
22-11-2005
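
For background, the failure mode the evaluation describes can be illustrated with a small C sketch. All names here (install_handler, library_segv_handler, prev_segv) are hypothetical, not NVidia's actual code: it shows the general pattern in which a native library installs its own SIGSEGV handler, and how forgetting to save and chain the previous handler silently discards the one HotSpot relies on (e.g. for implicit null checks), turning a fault the JVM would normally absorb into a core dump.

/* Hypothetical sketch (not NVidia's code) of the failure mode: a native
 * library replaces the SIGSEGV handler that HotSpot installed.  If it
 * does not save and forward to the previous handler, faults the JVM
 * would normally recover from become fatal. */
#define _POSIX_C_SOURCE 200809L
#include <signal.h>
#include <stdlib.h>

static struct sigaction prev_segv;          /* the handler we displaced */

static void library_segv_handler(int sig, siginfo_t *info, void *ctx)
{
    /* A well-behaved library forwards faults it does not own to the
     * previously installed handler (here, the JVM's): */
    if (prev_segv.sa_flags & SA_SIGINFO)
        prev_segv.sa_sigaction(sig, info, ctx);
    else if (prev_segv.sa_handler != SIG_DFL && prev_segv.sa_handler != SIG_IGN)
        prev_segv.sa_handler(sig);
    else
        abort();                            /* no one else can handle it */
}

/* Called from the library's initialization path. */
static void install_handler(void)
{
    struct sigaction sa;
    sa.sa_sigaction = library_segv_handler;
    sa.sa_flags = SA_SIGINFO;
    sigemptyset(&sa.sa_mask);
    /* Saving the old action into prev_segv and chaining to it (above) is
     * exactly the step a misbehaving library omits; without it, the JVM's
     * handler is simply gone and the process dumps core on the next SEGV
     * that HotSpot expected to field itself. */
    sigaction(SIGSEGV, &sa, &prev_segv);
}

Preloading libjsig.so achieves the same chaining without modifying the offending library: it interposes the signal() and sigaction() entry points so that handlers installed after VM startup are recorded and invoked after the VM's own handlers rather than replacing them, per the signal-chaining document linked above.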