Summary
-------
When dumping the heap in binary format, always use HPROF format 1.0.2. Previously, format 1.0.1 was used for heaps smaller than 2GB.
Problem
-------
```
public class BigArray {
    public static void main(String[] args) throws Exception {
        // 512M longs = 4GB of array data, which overflows a u4 length field.
        long[] b = new long[1024 * 1024 * 1024 / 2];
        Object o = new Object();
        synchronized (o) {
            // Keep the process alive long enough to attach jmap.
            o.wait(60000);
        }
    }
}
```
If you run the above code, generate a dump file with `jmap`, and then parse the dump file with `jhat`, you will get a warning message:
"WARNING: Error reading heap dump or heap dump segment: Byte count is -4294967296 instead of 0"
Eclipse MAT also fails to parse the dump file correctly.
Solution
--------
The root cause is that the byte count of the array exceeds the maximum value of the u4 field used to hold the heap dump segment length.
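To see the arithmetic behind the overflow: the array holds 512M `long`s, i.e. 2^32 bytes, one more than a u4 can represent, so the stored length wraps to 0 and a parser comparing the two counts reports the -4294967296 difference seen in the warning. A minimal sketch (the class name `U4Overflow` is made up for illustration):

```java
public class U4Overflow {
    public static void main(String[] args) {
        long elements = 1024L * 1024 * 1024 / 2; // 536,870,912 longs
        long bytes = elements * 8;               // 4,294,967,296 bytes = 2^32
        long u4Max = 0xFFFFFFFFL;                // 4,294,967,295

        // The real byte count exceeds what a u4 can hold...
        System.out.println(bytes > u4Max);       // true

        // ...so, stored in a u4, it wraps around to 0.
        long wrapped = bytes & 0xFFFFFFFFL;
        System.out.println(wrapped);             // 0

        // A parser expecting `bytes` but finding `wrapped` sees this delta:
        System.out.println(wrapped - bytes);     // -4294967296
    }
}
```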
The solution is to always use HPROF format 1.0.2's segmented dump format, and to truncate arrays larger than 2GB down to 2GB.
Webrev: http://cr.openjdk.java.net/~ddong/8144732/hotspot.02.
Specification
-------------
The HPROF heap dump file header always contains "JAVA PROFILE 1.0.2" and never contains "JAVA PROFILE 1.0.1".
The JVM parameter "SegmentedHeapDumpThreshold" is deprecated.
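A tool consuming dump files can rely on the header string above. A minimal sketch of reading it, assuming only that the file begins with a NUL-terminated format string; `readVersion` is a hypothetical helper, and the temp file here merely stands in for a real dump produced by `jmap`:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class HprofHeaderCheck {
    // Reads the NUL-terminated format string at the start of an HPROF file.
    static String readVersion(Path file) throws IOException {
        try (InputStream in = Files.newInputStream(file)) {
            StringBuilder sb = new StringBuilder();
            int b;
            while ((b = in.read()) > 0) { // stop at the NUL byte (or EOF)
                sb.append((char) b);
            }
            return sb.toString();
        }
    }

    public static void main(String[] args) throws IOException {
        // Simulate a dump header; a real file would come from jmap.
        Path tmp = Files.createTempFile("heap", ".hprof");
        Files.write(tmp, "JAVA PROFILE 1.0.2\0".getBytes("US-ASCII"));
        System.out.println(readVersion(tmp)); // JAVA PROFILE 1.0.2
        Files.delete(tmp);
    }
}
```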