JDK-6799693 : Server compiler leads to data corruption when expression throws an Exception
  • Type: Bug
  • Component: hotspot
  • Sub-Component: compiler
  • Affected Version: hs14
  • Priority: P2
  • Status: Closed
  • Resolution: Fixed
  • OS: generic
  • CPU: generic
  • Submitted: 2009-01-30
  • Updated: 2011-03-08
  • Resolved: 2011-03-08
JDK 6: 6u18 (Fixed)
JDK 7: 7 (Fixed)
Other: hs15 (Fixed)
The following test case produces different results when run in -Xcomp mode than in -Xint or -Xmixed modes.

> java -server -Xint Tester
Got java.lang.NegativeArraySizeException
Tester.var_bad = 2 (expected 2)

> java -server -Xmixed Tester
Got java.lang.NegativeArraySizeException
Tester.var_bad = 2 (expected 2)

> java -server -Xcomp -XX:CompileOnly=Tester Tester
Got java.lang.NegativeArraySizeException
Tester.var_bad = 1 (expected 2)

=== Tester.java ===

public class Tester {
   static int var_bad = 1;

   public static void main(String[] args) {
      // Store that the compiled code must not lose when the
      // allocation below throws: bumps var_bad from 1 to 2.
      var_bad++;

      try {
         for (int i = 0; i < 10; i++) (new byte[((byte)-1 << i)])[0] = 0;
      } catch (Exception e) { System.out.println("Got " + e); }

      System.out.println("Tester.var_bad = " + var_bad + " (expected 2)\n");
   }
}


The problem is server compiler specific (32 and 64 bit).

PUBLIC COMMENTS Problem: Bugs 6799693 and 6795161 have the same cause: a store can flow below the call on the slow path of an allocation, since an allocation consumes only the raw memory slice. As a result, when an exception occurs during the allocation call, the value of the store's memory slice is incorrect. Solution: use the merged memory state for an allocation's slow path and raw memory for the fast path.
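The hazard described above can be sketched in plain Java (a hypothetical reduction for illustration, not the original regression test; the class and field names are made up):

```java
public class StoreBeforeAllocation {
    static int flag = 0;

    static void work() {
        // A store to an ordinary (non-raw) memory slice...
        flag = 1;
        // ...followed by an allocation whose slow path can throw.
        // If the compiler lets the store sink below the allocation
        // call, the exception path would observe the stale flag value.
        byte[] b = new byte[-1]; // always throws NegativeArraySizeException
        b[0] = 0;                // never reached
    }

    public static void main(String[] args) {
        try {
            work();
        } catch (NegativeArraySizeException e) {
            // Correct behavior: the store must be visible here.
            System.out.println("flag = " + flag + " (expected 1)");
        }
    }
}
```

Under the bug, C2-compiled code could print the stale value on the exception path, while the interpreter always prints the updated one.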

EVALUATION http://hg.openjdk.java.net/jdk7/hotspot-comp/hotspot/rev/7fe62bb75bf4

EVALUATION The fix for 6711100 improved our ability to reason about negative array lengths, which results in the normal allocation diamond collapsing completely. The memory uses on the fast-path side by the InitializeNode forced the evaluation of the memory effects before the allocation, but once that side collapsed only raw memory was being consumed, so the stores get lost. The suggested fix:

   Node *mem;
   if (C->do_escape_analysis()) {
     mem = reset_memory();
     set_all_memory(mem);
   } else {
     //mem = memory(Compile::AliasIdxRaw);
     mem = merged_memory();
   }

seems reasonable, though I'm unclear why escape analysis being on or off should affect how memory is handled here. Can't it simply always be merged_memory()? This will only occur with constant negative array sizes, so it's unlikely to affect real programs.
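The "constant negative array sizes" remark can be checked directly: `(byte)-1` is -1, and left-shifting -1 keeps the value negative, so every iteration of the test's loop requests a provably negative length. A small sketch (illustrative only):

```java
public class NegativeLengths {
    public static void main(String[] args) {
        // (byte)-1 widens to int -1; -1 << i is -(2^i),
        // so the requested array length is negative for every i.
        for (int i = 0; i < 10; i++) {
            int len = ((byte) -1) << i;
            System.out.println("i=" + i + " -> length " + len);
        }
    }
}
```

Because the length is a compile-time-provable negative constant, the optimizer can prune the allocation's fast path entirely, which is what exposed the lost store.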