In Apache Derby there is some code that normalizes floating-point values. The results of this normalization vary between runs, although I believe they should be well defined. The code is essentially this:

    // turn negative zero into positive zero
    if (v == 0.0f) v = 0.0f;
    return v;

Since (-0.0f == 0.0f) evaluates to true per the language specification, v should be assigned the value 0.0f (positive zero) whether the original value of v is 0.0f or -0.0f (negative zero). Hence, this code should never return negative zero. However, it sometimes does. This appears to happen once the runtime optimizer has kicked in; it probably considers the if statement free of side effects and optimizes it away.

To reproduce, compile and run this Java class:

    public class Normalize {
        public static void main(String[] args) {
            System.out.println("normalize(-0.0f): " + normalize(-0.0f));
            for (int i = 0; i < 1000000; i++) {
                normalize(-0.0f);
            }
            System.out.println("normalize(-0.0f): " + normalize(-0.0f));
        }

        public static float normalize(float v) {
            if (v == 0.0f) v = 0.0f;
            return v;
        }
    }

When I run this code with the server VM, I get this output most of the time:

    $ java -server Normalize
    normalize(-0.0f): 0.0
    normalize(-0.0f): -0.0

That is, the first call to normalize(-0.0f) returns positive zero, as expected, but after the warm-up loop the call returns negative zero. Occasionally with the server VM, and every time with the client VM or with -Xint, I see the output I expect:

    $ java -client Normalize
    normalize(-0.0f): 0.0
    normalize(-0.0f): 0.0
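As a side note, a common way to write this normalization without relying on the compare-and-assign pattern is to add positive zero: under IEEE 754 round-to-nearest arithmetic, (-0.0f) + 0.0f is +0.0f, while x + 0.0f leaves every other value (including NaN and the infinities) unchanged. A minimal sketch of that variant follows; the class name NormalizeWorkaround is mine, and I make no claim about how the optimizer treats it:

    public class NormalizeWorkaround {
        // Adding positive zero maps -0.0f to 0.0f and leaves all other
        // values unchanged (IEEE 754, round-to-nearest).
        public static float normalize(float v) {
            return v + 0.0f;
        }

        public static void main(String[] args) {
            // Float.compare distinguishes -0.0f from 0.0f even though
            // (-0.0f == 0.0f) is true, so use it to verify the result.
            System.out.println(Float.compare(normalize(-0.0f), 0.0f)); // 0: positive zero returned
            System.out.println(Float.compare(-0.0f, 0.0f));            // -1: the zeros are distinct
        }
    }

Float.compare (or Float.floatToIntBits) is also handy for checking the sign of the returned zero programmatically, rather than reading it off the printed output.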