JDK-4780022 : REGRESSION: Swing line quality degraded from mantis b06 to b07 on Windows

Fixed Versions: 1.4.2 (b12)


The fix to 4693161 (clipped line performance) introduced a regression in line quality.

Basically, we now perform line clipping in integer coordinates at too high a level.  This lack of precision causes a difference between how we render clipped 
versus unclipped lines.

Here's a beautiful ASCII representation of what is happening:

Suppose we have a line that spans 10 pixels in the x direction and 1 in the
y direction:

    XXXXX
         XXXXXX

Most line-drawing algorithms would step down one pixel at the half-way point.

Now, suppose we have a user clip at x=2.  The result should be a line that
begins 2 pixels over from the previous line, but is otherwise identical:

      XXX
         XXXXXX

What we actually get through our newly clipped d3d lines is the following:

      XXXX
          XXXXX

So we now drop down one in y one pixel over from where we did prior to clipping our d3d lines.

I've attached a simple app that shows the problem clearly.  Run it on the latest
jdk1.4.2 (b7 or later) and note the following:
- The Blue line is painted first.  The Black line represents the Clip
boundary.  The red line is painted over the blue line.
- If the bug does not exist, you should see the blue line to the left of
the black line and the red line to the right of the black line (completely
hiding the blue line).
- When the bug is present, you will see the blue line, the red line, and
also more blue about halfway across the red line.  This extra blue is the error
introduced in the new clipping algorithm.
- For a cooler, flashier version of the app run "java LineClipError -dynamic",
which scrolls the clip to the left and right; observe that the error (the amount of blue that shouldn't be there) increases and decreases with the position of the clip.
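The attached LineClipError app is not reproduced here, but the check it performs can be sketched headlessly (class and constant names below are my own) by drawing into a BufferedImage, where Java2D's software loops do the rendering:

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class LineClipCheck {
    static final int W = 120, H = 40, CLIP_X = 60;

    // Draws a blue line, then the same line in red with a clip at
    // x >= CLIP_X, and counts blue pixels left inside the clipped area.
    // Any leftover blue there is the clipping error described above.
    static int blueInsideClip() {
        BufferedImage img = new BufferedImage(W, H, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        g.setColor(Color.WHITE);
        g.fillRect(0, 0, W, H);

        g.setColor(Color.BLUE);               // blue line, unclipped
        g.drawLine(0, 10, W - 1, 15);

        g.setClip(CLIP_X, 0, W - CLIP_X, H);  // clip, then red over it
        g.setColor(Color.RED);
        g.drawLine(0, 10, W - 1, 15);
        g.dispose();

        int count = 0;
        for (int x = CLIP_X; x < W; x++)
            for (int y = 0; y < H; y++)
                if (img.getRGB(x, y) == Color.BLUE.getRGB()) count++;
        return count;
    }

    public static void main(String[] args) {
        System.out.println("blue pixels inside clip: " + blueInsideClip());
    }
}
```

Since the software loops clip with sub-pixel consistency, this sketch should report 0 leftover blue pixels; the bug only shows up on screen through the d3d hardware path.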



Work around: run with d3d disabled:
	java -Dsun.java2d.d3d=false <app>

The problem is that we're clipping in integer coordinates without
taking into account the Bresenham error/step terms.

For instance, take the example in the Description.  The original line
was from (0, y) to (10, y+1).  When we clipped the line at x=2, we essentially
told d3d to draw a line from (2, y) to (10, y+1).  The point at which we stepped
down to (y+1) on the first line was at x=5.  But the Bresenham algorithm for
the second line would have us step down at x=6 (halfway between the start and
end points).
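The stepping difference can be reproduced with the classic midpoint/Bresenham error term (a sketch, not the actual JDK or d3d code):

```java
// Shows how restarting the Bresenham error term at an integer clip
// boundary moves the point where a shallow line steps down in y.
public class BresenhamStep {
    // Returns the x at which a Bresenham line from (x0, 0) to (x1, 1)
    // first steps down to y = 1.
    static int stepDownX(int x0, int x1) {
        int dx = x1 - x0, dy = 1;
        int d = 2 * dy - dx;          // midpoint decision variable
        for (int x = x0 + 1; x <= x1; x++) {
            d += 2 * dy;
            if (d > 0) return x;      // first pixel drawn at y = 1
        }
        return -1;                    // never steps (not possible here)
    }

    public static void main(String[] args) {
        System.out.println(stepDownX(0, 10)); // original line: steps at x=5
        System.out.println(stepDownX(2, 10)); // integer-clipped line: steps at x=6
    }
}
```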

The fix here is to take advantage of the fact that d3d uses floating point
coordinates for its lines and clip in float coordinates instead of 
integer.  So instead of clipping the line above at (2, y), we would clip
it at (2, y+0.2), which is essentially where the line would be in sub-pixel coordinates if drawn from the original point of (0, y).
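The sub-pixel clip point is just the linear interpolation of the original endpoints at the clip edge (method name here is illustrative):

```java
// Computes where the original line actually crosses the clip edge,
// instead of snapping the clipped endpoint to integer coordinates.
public class FloatClip {
    static double yAtClip(double x0, double y0,
                          double x1, double y1, double clipX) {
        return y0 + (clipX - x0) * (y1 - y0) / (x1 - x0);
    }

    public static void main(String[] args) {
        // Line from (0, y) to (10, y+1), clipped at x=2: sub-pixel y is y+0.2.
        System.out.println(yAtClip(0, 0, 10, 1, 2));
    }
}
```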

###@###.### 2002-11-15

The "clip in float coordinates" approach worked fine except for one minor detail:
the d3d hardware I tested with (both an nVidia card and an ATI Radeon)
had such limited floating-point precision that there were many pixelization
errors (clipped lines drawn with different pixels than unclipped).  So we had
to take a different approach.

We now enable d3d clipping and set up the viewport appropriately every time
we draw a line (or rectangle).  This allows us to draw with the unclipped
coordinates, which means that the d3d hardware does the same setup for 
clipped and unclipped lines and thus ends up with the same pixels covered
inside the clipped area.
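The difference between the two strategies can be sketched like this (illustrative code, not the actual d3d path): viewport clipping rasterizes the original endpoints and merely discards pixels outside the clip, while coordinate clipping restarts the rasterizer at the clip edge and shifts the step point.

```java
import java.util.HashSet;
import java.util.Set;

public class ClipStrategies {
    // Bresenham pixels of a shallow line (0 <= dy <= dx), keeping only
    // pixels with x >= minX (a one-sided "viewport" for this sketch).
    static Set<String> raster(int x0, int y0, int x1, int y1, int minX) {
        Set<String> out = new HashSet<>();
        int dx = x1 - x0, dy = y1 - y0;
        int d = 2 * dy - dx, y = y0;
        for (int x = x0; x <= x1; x++) {
            if (x >= minX) out.add(x + "," + y);
            d += 2 * dy;
            if (d > 0) { y++; d -= 2 * dx; }
        }
        return out;
    }

    public static void main(String[] args) {
        // Viewport clipping: original endpoints, pixels discarded at x < 2.
        Set<String> viewport = raster(0, 0, 10, 1, 2);
        // Coordinate clipping: rasterizer restarted at the clip edge.
        Set<String> coordinate = raster(2, 0, 10, 1, 2);

        System.out.println(viewport.equals(coordinate));  // false
        System.out.println(viewport.contains("5,1"));     // steps at x=5, like the unclipped line
        System.out.println(coordinate.contains("5,0"));   // step has moved to x=6
    }
}
```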

###@###.### 2002-12-03

This last attempt worked fine except that some cards end up with the same
line-clipping artifacts with viewport clipping that I saw when I tried
integer clipping (which resulted in this bug) and floating-point clipping (which had the problems described above).

Another approach I tried involved using the d3d clip planes, but these appear to be ignored completely for screen-space primitives.

The final result was this:
- handle clipping via viewport clipping, as described above
- beef up the runtime test that determines whether or not to use d3d to
render lines.  This test now includes a clipping check of whether a
clipped line is drawn in the same pixel path as the unclipped line.
Specifically, we draw a nearly-horizontal line and then redraw it several
times, clipping further on the left each time.  We should end up with the
same pixels covered after all of these lines, but on some hardware the
clipped lines step down later than the unclipped line, leaving an overlap
of pixels in the first row.
- If the test passes for unclipped d3d lines but fails for clipped lines, we
simply disable d3d line clipping on this device and end up handling diagonal
clipped lines through the same mechanism as jdk1.4.1.
- If line clipping passes the test, we enable line clipping via d3d and all
lines are rendered through d3d (barring any other problems that crop up).
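The runtime check above can be sketched as follows (illustrative code, not the actual JDK test): render the line unclipped, re-render it with the clip moved further right each time, and verify that no pixels beyond the unclipped line get covered.

```java
import java.util.HashSet;
import java.util.Set;

public class ClipQualityCheck {
    // Bresenham pixels of the line (0,0)-(10,1), clipped at x >= clipX.
    // If restart == true the error term is reinitialized at the clip
    // edge (the buggy behavior); otherwise pixels left of clipX are
    // simply discarded, as with viewport clipping.
    static Set<String> raster(int clipX, boolean restart) {
        int x0 = restart ? clipX : 0, x1 = 10;
        Set<String> out = new HashSet<>();
        int dx = x1 - x0, dy = 1;
        int d = 2 * dy - dx, y = 0;
        for (int x = x0; x <= x1; x++) {
            if (x >= clipX) out.add(x + "," + y);
            d += 2 * dy;
            if (d > 0) { y++; d -= 2 * dx; }
        }
        return out;
    }

    // Pass condition of the check: successively clipped redraws cover
    // no pixels beyond the unclipped line.
    static boolean clippedLinesConsistent(boolean restart) {
        Set<String> covered = raster(0, false);   // unclipped line
        int unclipped = covered.size();
        for (int clipX = 1; clipX <= 5; clipX++)
            covered.addAll(raster(clipX, restart));
        return covered.size() == unclipped;
    }

    public static void main(String[] args) {
        System.out.println(clippedLinesConsistent(false)); // viewport-style clip passes
        System.out.println(clippedLinesConsistent(true));  // restarted clip covers extra pixels
    }
}
```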

The end results of these changes are:
- Performance is similar to jdk1.4.1 on platforms that do not do correct clipping
- Performance is much improved (jdk1.4.2 vs. jdk1.4.1) on platforms that
can handle correct clipping
- Quality is the same on both releases (and should be similar to the line
quality of our own software loops).

Note that we should work on an advanced approach to line drawing in the future that should handle line clipping on all d3d hardware, using a mask (either zbuffer or stencil buffer).  That fix is beyond the scope of jdk1.4.2, but would be a reasonable performance feature for a future release.

###@###.### 2002-12-19
