JDK-4366799 : colormap sometimes wrong in 8-bit display mode (win32 only)
  • Type: Bug
  • Component: client-libs
  • Sub-Component: 2d
  • Affected Version: 1.4.0
  • Priority: P3
  • Status: Resolved
  • Resolution: Fixed
  • OS: windows_nt
  • CPU: x86
  • Submitted: 2000-08-29
  • Updated: 2001-03-26
  • Resolved: 2001-03-26
Other: 1.4.0 beta (Fixed)
Related Reports
Relates :  
Relates :  
Relates :  
Relates :  
Description
Run VolatileDuke in 16 or 32-bit mode.  Switch the display depth to 8-bit.
Note that the Duke image is now all black.

Note that the application runs correctly when you start in 8-bit mode,
even if you switch to another depth and then return to 8-bit mode.  So
I suspect there is some colormap initialization that is happening when
starting in 8-bit mode but is not happening when we switch to that mode.

-----------------


I had originally thought that this problem was confined to VolatileImage
objects, such as those used in VolatileDuke.  It also seemed that the problem
only occurred when the display mode was switched into 8-bit mode from some
other mode.

However, the problem is much more far-reaching than that.  The problem
can be reproduced by a simple app that draws text directly to the onscreen
window.  In 8-bit mode, simply toggle between that app and some other 
app that has its own palette (such as IE or Netscape); you should see
wrong colors being used in various drawing primitives, such as text.
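
A minimal sketch (not the actual program used for this report) of the kind of
"simple app" described above: a frame that fills a color and draws text
directly to the onscreen window.  In 8-bit mode, toggle between this window
and a palettized app and look for wrong fill and text colors.

    import java.awt.*;
    import java.awt.event.*;

    // Fills a color and draws text directly to the onscreen window.
    public class DirectDrawTest extends Frame {
        public DirectDrawTest() {
            super("Direct onscreen drawing");
            setSize(300, 200);
            addWindowListener(new WindowAdapter() {
                public void windowClosing(WindowEvent e) { System.exit(0); }
            });
        }

        public void paint(Graphics g) {
            g.setColor(new Color(64, 128, 255));  // fill color, rendered onscreen
            g.fillRect(0, 0, getWidth(), getHeight());
            g.setColor(Color.yellow);             // text color, rendered onscreen
            g.drawString("Direct rendering to the screen", 20, 100);
        }

        public static void main(String[] args) {
            new DirectDrawTest().setVisible(true);
        }
    }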

I will attach a simple test program that shows off this bug fairly well.
The test is VImageColors.  It consists of one onscreen window with
three sections.  The left third is drawn using a VolatileImage back buffer.
The middle pane uses a BufferedImage back buffer.  And the right pane
is drawn directly to the screen.  All three panes use the same image, 
same text color, and same fill color, so they should all look the same
(modulo some dithering differences in some situations).  
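
Since the attached VImageColors test is not reproduced here, the following is
a rough sketch of that structure, assuming a single Frame whose paint() method
renders the same content three ways (the Duke image is omitted): a
VolatileImage back buffer on the left, a BufferedImage back buffer in the
middle, and direct onscreen rendering on the right.

    import java.awt.*;
    import java.awt.image.BufferedImage;
    import java.awt.image.VolatileImage;

    public class ThreePaneColors extends Frame {
        private VolatileImage volBuffer;
        private BufferedImage bufBuffer;

        // The same content is rendered into every pane.
        private void renderScene(Graphics g, int w, int h) {
            g.setColor(new Color(64, 128, 255));   // same fill color everywhere
            g.fillRect(0, 0, w, h);
            g.setColor(Color.yellow);              // same text color everywhere
            g.drawString("Same content", 10, h / 2);
        }

        public void paint(Graphics g) {
            int w = getWidth() / 3, h = getHeight();
            GraphicsConfiguration gc = getGraphicsConfiguration();

            // Left pane: VolatileImage back buffer (lives in device memory).
            if (volBuffer == null
                    || volBuffer.validate(gc) == VolatileImage.IMAGE_INCOMPATIBLE) {
                volBuffer = createVolatileImage(w, h);
            }
            Graphics vg = volBuffer.getGraphics();
            renderScene(vg, w, h);
            vg.dispose();
            g.drawImage(volBuffer, 0, 0, null);

            // Middle pane: BufferedImage back buffer (system-memory image).
            if (bufBuffer == null) {
                bufBuffer = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
            }
            Graphics bg = bufBuffer.getGraphics();
            renderScene(bg, w, h);
            bg.dispose();
            g.drawImage(bufBuffer, w, 0, null);

            // Right pane: draw directly to the onscreen window.
            g.translate(2 * w, 0);
            renderScene(g, w, h);
            g.translate(-2 * w, 0);
        }

        public static void main(String[] args) {
            ThreePaneColors f = new ThreePaneColors();
            f.setSize(600, 200);
            f.setVisible(true);
        }
    }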

The first noticeable problem is that the right panel uses the wrong
color for the fill and the text.  This is an unrelated problem that I have
filed as 4425895.

The second problem is noticeable when toggling between this app and some
other palettized app; it is clear that various rendering bugs prevent the
colors from being correct when the app is in the background (the fill color
is wrong in some panels, the Duke image is either badly dithered or simply
wrong, and the text color is wrong in some panels).

A third problem is seen when you start the app in the background (run the
program and then immediately bring a palettized app to the foreground, so
that when the window eventually comes up it is in the background).  Now the
colors look wrong in some of the panels both when the app is in the
background and when it is in the foreground.


Comments
CONVERTED DATA
BugTraq+ Release Management Values
COMMIT TO FIX: merlin-beta
FIXED IN: merlin-beta
INTEGRATED IN: merlin-beta
14-06-2004

EVALUATION
There is bad logic in awt_Toolkit's adjustColormap() function which prevents
an inverse LUT from being calculated when the display mode is switched to
8 bits.  If the display starts in 8-bit mode, the logic works fine, but
otherwise the LUT will never be calculated.  The logic works like this:
- If the bit depth != 8 bits, return without calculating anything.
- If the function is called with a NULL DC, then calculate the LUT, assign
  intelligent values to m_palEntryCount and m_palSeed, put the color table in
  m_palEntries, and put the inverse LUT in m_inverseDesktopLUT.
- If the function is called with a non-NULL DC, then calculate the LUT, assign
  an intelligent value to m_palEntryOffscreenCount, put the color table in
  m_palOffscreenEntries, and put the LUT in m_inverseOffscreenLUT.

The function seems to be called with a NULL DC only in the constructor of
AwtToolkit.  So in the case where the user starts the application when the
display mode is != 8 bits, the logic above prevents the color table, the LUT,
and the m_pal* values from ever being calculated.

The fix may be as simple as calling adjustColormap(NULL) when processing the
WM_DISPLAYCHANGE event.  We currently call the function with a valid DC, but
we could also call it with NULL to make sure that we calculate all values
appropriately when the new mode is 8 bits.

The right solution should probably take a look at how we are dealing with
colormaps overall and try to extract some of the intertwined logic in this
(and other) functions.  For example, why do we have a single function with
such different uses?  Shouldn't we have separate functions for calculating
the desktop values and the DC values?

--------------------------------------

I tried the suggested quick fix mentioned above (call adjustColormap(NULL) in
the WM_DISPLAYCHANGE event) and it appears to work.  The better solution
(reworking our processing of colormaps in general) is being done by bchristi
in conjunction with some multimon work.  My quick fix will at least allow
things to work until Brent's work is put back.
chet.haase@Eng 2001-01-17

I'm putting back this fix now.  However, there is still work to be done on
this general issue, such as forcing offscreen images to use the current
colormap when drawing themselves and forcing them to redraw themselves when
the palette changes.  I'm leaving the bug open pending these other changes.
chet.haase@Eng 2001-02-06

-------------------------

The problems described in the updated Description (seen in VImageColors) are
attributable to the same root cause: our use of a static IndexColorModel for
our color conversions during rendering.  The problem is that we create our
initial images and onscreen windows with an associated ColorModel.  This
model is always built from our ideal custom palette.  This ColorModel is used
for translating RGB values into color indices, which is done whenever a
drawing color is selected in the Graphics object and when a color-converting
Blit is performed.  In the case of BufferedImage objects, this is a valid
approach; the ColorModel never changes, and rendering to that image is always
correct given that ColorModel.  But in the case of onscreen windows and
VolatileImage objects (both of which are at the whims of display device
settings), a static ColorModel object no longer makes sense.  If we use the
old ColorModel to derive our index colors for these surfaces, we will
calculate indices which may have some completely different color stored in
them on the actual display device.
Instead, these objects need to have a dynamic ColorModel object which is up
to date with the current display settings.  So, for example, when another
palettized app is brought to the foreground, we should update the ColorModel
associated with surfaces (onscreen and offscreen) that use our palette.
There were various places in the code where changes needed to be made to
accommodate this fix, but the major changes boiled down to the following:
- Onscreen windows and VolatileImage objects (as well as hidden-acceleration
  objects, since they, too, may live in device memory) now get a new, dynamic
  ColorModel (one per GraphicsConfiguration) called the deviceColorModel.
- The deviceColorModel is updated whenever there has been a palette change.
  Specifically, when we receive a WM_PALETTECHANGED event, if the new colors
  in the display device do not match the old colors, then we need to update
  our dynamic ColorModel.
- Since VolatileImage objects might be rendered once to some buffer and then
  never touched again, they may have incorrect color index values after a
  palette change.  These images must be re-rendered.  This happens by forcing
  a "contentsLost" situation on the VolatileImages.  Similarly, hidden
  acceleration surfaces are re-copied into the hardware cache as appropriate.
- Make sure that all of our native surface code always gets the updated copy
  of the colorModel data.  Some code in Win32SurfaceData and
  Win32OffScreenSurfaceData was incorrect and resulted in using old palette
  data even after we switched to a dynamic ColorModel approach.
chet.haase@Eng 2001-03-14

-------------------------

A couple more important fixes were necessary to make this work with display
mode switches (switching from one display depth to another):
- When we are recreating the hardware surfaceData, we need to grab the
  colorModel from the Component's GraphicsConfiguration, not the one we were
  created with.  Otherwise, we simply get the colorModel from the last
  display depth and the recreated hardware surfaces look horrible (not just
  in 8-bit mode).
- There appears to be a Windows bug that prevents a WM_PALETTECHANGED message
  from being propagated to our component after a display switch.  So we end
  up calculating an IndexColorModel using some default system palette and we
  never update that colorModel.  Meanwhile, the system palette is updated by
  some app, so the indices we use are based on an obsolete palette and the
  colors look horrible.  The fix for this is to trap the WM_PALETTEISCHANGING
  event (which does get sent in this situation, surprisingly) and force our
  WmPaletteChanged() method to be called after a display change has occurred.

These further fixes have been applied, and everything appears to work both
during palette changes in 8-bit mode and during display mode switches between
all depths.
chet.haase@Eng 2001-03-16
16-03-2001
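
The fix described above works by forcing a "contentsLost" condition on
VolatileImage objects after a palette change, so that client code re-renders
them with up-to-date colors.  For reference, a minimal sketch of the standard
client-side VolatileImage render loop that reacts to that condition is shown
below; the helper names (renderTo, copyVolatileImage) are illustrative only
and are not part of the fix.

    import java.awt.*;
    import java.awt.image.VolatileImage;

    public class VolatileRenderLoop {

        // Re-render the back buffer contents (fill, text, images, ...).
        static void renderTo(Graphics g, int w, int h) {
            g.setColor(Color.blue);
            g.fillRect(0, 0, w, h);
            g.setColor(Color.yellow);
            g.drawString("Re-rendered after contentsLost", 10, h / 2);
        }

        // Copy a VolatileImage back buffer to the screen, revalidating and
        // re-rendering until the copy completes with valid contents.
        static VolatileImage copyVolatileImage(Component c, VolatileImage vImg,
                                               Graphics onscreen, int w, int h) {
            GraphicsConfiguration gc = c.getGraphicsConfiguration();
            do {
                int status = (vImg == null) ? VolatileImage.IMAGE_INCOMPATIBLE
                                            : vImg.validate(gc);
                if (status == VolatileImage.IMAGE_INCOMPATIBLE) {
                    vImg = c.createVolatileImage(w, h);  // recreate for the new GC
                    status = VolatileImage.IMAGE_RESTORED;
                }
                if (status == VolatileImage.IMAGE_RESTORED) {
                    Graphics g = vImg.getGraphics();     // surface was lost or
                    renderTo(g, w, h);                   // restored: repaint it
                    g.dispose();                         // from scratch
                }
                onscreen.drawImage(vImg, 0, 0, null);
            } while (vImg.contentsLost());               // lost mid-copy? redo
            return vImg;
        }
    }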