FULL PRODUCT VERSION :
any 1.6beta client VM
FULL OS VERSION :
Windows XP ver 2002 SP2
A DESCRIPTION OF THE PROBLEM :
I am working on a Sega Genesis emulator. After seeing it fail on any 1.6beta client VM, I ran some tests and found the first point where something goes wrong. I use code like the following to sign-extend 16-bit values in my CPU emulator, for example:
int EffectiveAddress = ((short)(memory.fetchOpWord(PC) & 0xFFFF));
At a certain point fetchOpWord returns 0xF62A, for example. In every VM except the 1.6 client this leads to EffectiveAddress = 0xFFFFF62A, while in the 1.6 client it becomes EffectiveAddress = 0xF62A.
So the value does not get sign-extended.
This always happens at this point, and judging by the behavior I see during emulation, I am sure the same happens at other locations as well.
It does not happen using the -Xint option, nor in the server VM.
I suspect the "& 0xFFFF" mask gets executed AFTER the narrowing cast to short, because leaving the mask out leads to the correct result.
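For reference, here is the evaluation the Java Language Specification prescribes, decomposed into single steps (the variable names are illustrative only, not from the emulator):

int raw = 0xF62A;                 // value returned by fetchOpWord
int masked = raw & 0xFFFF;        // 0x0000F62A; the mask applies BEFORE the cast
short narrowed = (short) masked;  // narrowing keeps the low 16 bits: -2518
int widened = narrowed;           // widening sign-extends: 0xFFFFF62A

The 1.6 client VM behaves as if the last two steps were dropped, i.e. as if the masked int were assigned directly.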
THE PROBLEM WAS REPRODUCIBLE WITH -Xint FLAG: No
THE PROBLEM WAS REPRODUCIBLE WITH -server FLAG: No
EXPECTED VERSUS ACTUAL BEHAVIOR :
Expected:
((short)(0xF000 & 0xFFFF)) = 0xFFFFF000;
Actual:
((short)(0xF000 & 0xFFFF)) = 0xF000;
REPRODUCIBILITY :
This bug can be reproduced always.
---------- BEGIN SOURCE ----------
public class Test {
    public static void main(String[] args) {
        new Test().test();
    }

    void test() {
        for (int i = 0; i <= 0xFFFF; i++) {
            // Mask to 16 bits, then narrow to short so that widening
            // back to int sign-extends values >= 0x8000.
            int ea = ((short) (i & 0xFFFF));
            System.out.println(Integer.toHexString(i) + " : " + Integer.toHexString(ea));
        }
    }
}
---------- END SOURCE ----------
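POSSIBLE WORKAROUND :
A shift-based sign extension avoids the narrowing cast entirely and might sidestep the problem; this is only an untested sketch of the idea:

// Hypothetical replacement for the cast-based idiom: shifting left and
// then arithmetically right sign-extends the low 16 bits.
int EffectiveAddress = (memory.fetchOpWord(PC) << 16) >> 16;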