A DESCRIPTION OF THE REQUEST :
Rendering to a remote X11 display can be slow, especially when anti-aliasing and alpha compositing are involved. For examples, see bugs 4488401 and 4723006.
It's easy enough for us as developers to implement two different rendering paths: one expensive and fancy for local displays, the other cheap and nasty for remote displays. But without a way to easily detect which situation we're in, the user has to configure the application by hand to get the best behavior.
Without such an enhancement, there's no way to automatically do the right thing.
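To make the request concrete, here is a minimal sketch of how an application might switch between the two paths. The locality flag itself is the missing piece: the `displayIsLocal` parameter stands in for the requested API (something like a hypothetical `GraphicsEnvironment.isDisplayLocal()`, which does not exist today).

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class RenderPathDemo {
    /** Configure rendering hints based on a (hypothetical) locality flag. */
    static void configure(Graphics2D g2, boolean displayIsLocal) {
        if (displayIsLocal) {
            // Fancy path: anti-aliasing and quality rendering.
            g2.setRenderingHint(RenderingHints.KEY_ANTIALIASING,
                                RenderingHints.VALUE_ANTIALIAS_ON);
            g2.setRenderingHint(RenderingHints.KEY_RENDERING,
                                RenderingHints.VALUE_RENDER_QUALITY);
        } else {
            // Cheap path: plain aliased rendering for remote displays.
            g2.setRenderingHint(RenderingHints.KEY_ANTIALIASING,
                                RenderingHints.VALUE_ANTIALIAS_OFF);
            g2.setRenderingHint(RenderingHints.KEY_RENDERING,
                                RenderingHints.VALUE_RENDER_SPEED);
        }
    }

    public static void main(String[] args) {
        BufferedImage img = new BufferedImage(10, 10, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g2 = img.createGraphics();
        // The flag would ideally come from the requested API; hard-coded here.
        configure(g2, true);
        System.out.println(g2.getRenderingHint(RenderingHints.KEY_ANTIALIASING)
                           == RenderingHints.VALUE_ANTIALIAS_ON);
        g2.dispose();
    }
}
```

The point is that the branch itself is trivial to write; only the condition is out of the application's reach.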
EXPECTED VERSUS ACTUAL BEHAVIOR :
I'd like to be able to tell from the GraphicsEnvironment whether I'm dealing with a remote X11 display. (Potentially, it would also be interesting to know whether I were running on criminally bad local hardware, but remote X11 displays are the only problem I've hit in real life.)
As far as I can tell, an X11GraphicsEnvironment looks the same whether the display is local or remote, and there's certainly no public, intention-revealing API for telling the difference.
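About the closest one can get from outside the toolkit is inspecting the DISPLAY environment variable, which is only a rough heuristic, not something the original report proposes. A sketch of that guess:

```java
public class DisplayLocality {
    /**
     * Rough heuristic: an X11 DISPLAY value like ":0" or "unix:0.0" names
     * the local server, while "host:0" names a (possibly) remote one.
     * This is only a guess -- ssh X forwarding, for instance, sets
     * DISPLAY to "localhost:10.0" for a connection that is really remote.
     */
    static boolean looksLocal(String display) {
        if (display == null || display.isEmpty()) {
            return true; // no X display at all, so nothing remote to worry about
        }
        int colon = display.indexOf(':');
        String host = (colon < 0) ? display : display.substring(0, colon);
        return host.isEmpty() || host.equals("unix") || host.equals("localhost");
    }

    public static void main(String[] args) {
        System.out.println(looksLocal(":0.0"));      // local-looking
        System.out.println(looksLocal("bigiron:0")); // remote-looking
        System.out.println(looksLocal(System.getenv("DISPLAY")));
    }
}
```

Because the string tells you where the socket points rather than how fast the link is, this cannot replace a real API inside the GraphicsEnvironment.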
CUSTOMER SUBMITTED WORKAROUND :
The best I've come up with is forcing the user to manually disable fancy visual effects.
In theory, I guess we could time stuff, but that's difficult: the obvious point to detect our situation is start-up, where performance is unrepresentative anyway because the VM is still loading and compiling code as it goes.
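If one did attempt the timing route, the usual mitigation for the start-up noise is an explicit warm-up pass before measuring, so JIT compilation and class loading have mostly settled. A minimal sketch of that idea; all names here are invented for illustration, and this still can't distinguish a slow link from a slow machine:

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class RenderTimer {
    /**
     * Time an operation after warming it up, returning the best-of-N
     * duration in nanoseconds. The warm-up runs give the VM a chance to
     * compile the hot path before we start the clock.
     */
    static long bestTimeNanos(Runnable op, int warmupRuns, int measuredRuns) {
        for (int i = 0; i < warmupRuns; i++) {
            op.run(); // discard these: JIT and class loading are still active
        }
        long best = Long.MAX_VALUE;
        for (int i = 0; i < measuredRuns; i++) {
            long start = System.nanoTime();
            op.run();
            best = Math.min(best, System.nanoTime() - start);
        }
        return best;
    }

    public static void main(String[] args) {
        // Offscreen image used as a stand-in drawing target; a real probe
        // would have to draw to the actual (possibly remote) display.
        BufferedImage img = new BufferedImage(200, 200, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g2 = img.createGraphics();
        long nanos = bestTimeNanos(() -> g2.fillRect(0, 0, 200, 200), 100, 10);
        System.out.println("best fillRect: " + nanos + " ns");
        g2.dispose();
    }
}
```

Even with the warm-up, interpreting the number requires a threshold chosen in advance, which is exactly the kind of guesswork a proper locality API would avoid.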