Sometimes users need to classify an integral value into a series of ranges, much as a switch statement classifies an integer according to a series of points.
The server JIT has internal logic to transform such a computation into a search of a balanced binary tree, but it uses that logic only for switch bytecodes. It should also perform this transformation for general nests of ifs and switches, wherever those nests all test the same integral value.
This would allow users to write integer classification code more idiomatically and naturally, without hand-optimizing control flow into search trees. There are two further advantages beyond avoiding hand optimization. First, the translation could be tuned using online profile data (branch and/or value profiling), so as to avoid testing for values that rarely occur in practice. Second, if the if/switch transformation is performed after loop transformations, then a textually simple loop (in source code) over the classification's critical points can be transformed into a finely tuned binary search (in object code).
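As a concrete illustration (the class and method names here are hypothetical, chosen only for this sketch), the kind of code this enhancement would target is a linear chain of ifs that classifies one integral value into ranges. Today the server JIT rebalances only switch bytecodes, so a chain like this is compiled as written, even though it is equivalent to a range classification:

```java
// Hypothetical example of idiomatic classification code: a nest of ifs
// that all test the same integral value. Semantically this is a search
// over the critical points 10_000, 40_000, 150_000, which could be
// rebalanced into a binary tree of comparisons.
public class Band {
    static int band(int value) {
        if (value < 10_000)  return 0;   // interval (-inf, 10_000)
        if (value < 40_000)  return 1;   // interval [10_000, 40_000)
        if (value < 150_000) return 2;   // interval [40_000, 150_000)
        return 3;                        // interval [150_000, +inf)
    }
}
```

Written this way the chain performs up to three sequential comparisons; a balanced search over the same three critical points needs at most two.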
Internally, the server JIT transforms switch statements into a series of ranges and then (unless the points are compact enough to merit a jump table) emits IR that performs a binary search to classify the input. Note that case labels for adjacent values are merged if they label the same statement. This means that even today some switches are transformed into interval tests.
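To make the merging concrete, here is a hedged sketch (illustrative names only) of a switch whose adjacent case labels share targets. Under the merging described above, its eleven point labels collapse into three intervals, so the classification is really three range tests rather than eleven equality tests:

```java
// Adjacent case labels with the same target merge into intervals:
// [0,3] -> 'F', [4,6] -> 'C', [7,10] -> 'A'.
public class Grade {
    static char grade(int score) {
        switch (score) {
            case 0: case 1: case 2: case 3:
                return 'F';   // merged interval [0, 3]
            case 4: case 5: case 6:
                return 'C';   // merged interval [4, 6]
            case 7: case 8: case 9: case 10:
                return 'A';   // merged interval [7, 10]
            default:
                return '?';   // everything outside [0, 10]
        }
    }
}
```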
Here is an example of a conversation in which users are hand-optimizing code that should be optimized automatically:
http://mail.openjdk.java.net/pipermail/core-libs-dev/2015-November/036766.html
The critical points of that classification are the powers of ten spanning the dynamic range of an int or long. They are most naturally produced by a short "for" loop, which should be unrolled into a series of ifs and then rebalanced.
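A minimal sketch of such a loop, counting the decimal digits of a non-negative int (this is an illustration of the shape of the code, not the implementation discussed in the linked thread). After full unrolling, the loop body becomes a chain of ifs over the critical points 10, 100, ..., 1_000_000_000, which the proposed transformation could rebalance into a binary search:

```java
// Count decimal digits of a non-negative int by looping over powers of
// ten. The loop variable is a long so that p *= 10 cannot overflow
// before exceeding Integer.MAX_VALUE.
public class DigitCount {
    static int digits(int x) {    // assumes x >= 0
        int n = 1;
        for (long p = 10; p <= x; p *= 10) {
            n++;                  // x has at least n digits
        }
        return n;
    }
}
```

Written as this loop, classifying a 10-digit value costs nine comparisons; a balanced search over the same nine critical points would cost at most four.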