JDK-8227434 : G1 predictions may over/underflow with high variance input
  • Type: Bug
  • Component: hotspot
  • Sub-Component: gc
  • Affected Version: 14
  • Priority: P3
  • Status: Resolved
  • Resolution: Fixed
  • Submitted: 2019-07-09
  • Updated: 2019-12-10
  • Resolved: 2019-11-29
JDK 14
14 b26 Fixed
Description
Currently G1 uses a predictor based on a decaying average that adds a safety margin derived from the sample sequence's variance/standard deviation.
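The predictor described above can be sketched as follows. This is a minimal illustration of a decaying average with a standard-deviation safety margin; the class name, the decay weight `alpha`, and the update formulas are hypothetical stand-ins, not HotSpot's actual G1Predictions code:

```cpp
#include <cmath>

// Hypothetical sketch of a decaying-average predictor: each new sample
// is blended into the running average/variance with weight `alpha`,
// and the prediction adds a confidence-scaled standard deviation.
class DecayingPredictor {
  double _alpha;      // decay weight for new samples (assumed)
  double _confidence; // safety-margin multiplier (assumed)
  double _avg = 0.0;
  double _var = 0.0;
  bool _has_sample = false;
public:
  DecayingPredictor(double alpha, double confidence)
      : _alpha(alpha), _confidence(confidence) {}

  void add(double sample) {
    if (!_has_sample) {
      _avg = sample;       // first sample seeds the average
      _has_sample = true;
      return;
    }
    double diff = sample - _avg;
    _avg += _alpha * diff;
    _var = (1.0 - _alpha) * (_var + _alpha * diff * diff);
  }

  // Average plus a safety margin of `_confidence` standard deviations.
  // Note: nothing clamps the result here, which is exactly the problem
  // this issue describes -- with high-variance input the margin can push
  // the value far outside any sensible range.
  double prediction() const {
    return _avg + _confidence * std::sqrt(_var);
  }
};
```

With high variance in recent samples, the `_confidence * sqrt(_var)` term can dominate the average, so a raw prediction for a ratio can exceed 1.0 or go negative unless the caller clamps it.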

On workloads/situations with recent high variance, the predictors in the G1Analytics class may currently return unexpectedly large negative or positive values, as the results are not clamped to a useful range.

These errors can affect prediction accuracy significantly; e.g. I have seen predictions for the number of surviving bytes in the ~2^63 range due to overflow, which then propagate further and result in completely impossible overall time predictions.

This is a day-one bug as far as I understand; only in very few cases do consumers of the predictions already clamp values "manually", e.g. the G1Policy::predict_yg_surv_rate() method.

It would be better if the G1Analytics predict_* methods did the value clamping themselves.
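The suggested fix amounts to clamping each prediction at the source, with the bounds chosen per quantity. A hedged sketch of what such clamping could look like; the helper names below are illustrative and not necessarily the names used in the actual patch:

```cpp
#include <algorithm>

// Illustrative clamping helpers: a raw prediction is forced into a
// range that makes sense for the quantity being predicted.
static double clamp_to(double value, double lo, double hi) {
  return std::min(std::max(value, lo), hi);
}

// Ratios (e.g. survival rates) must lie in [0.0, 1.0].
static double predict_in_unit_interval(double raw_prediction) {
  return clamp_to(raw_prediction, 0.0, 1.0);
}

// Sizes and durations must not be negative.
static double predict_zero_bounded(double raw_prediction) {
  return std::max(raw_prediction, 0.0);
}
```

Doing this inside the predict_* methods centralizes the bounds in one place, instead of relying on each caller to remember the manual clamping that today only a few sites (such as G1Policy::predict_yg_surv_rate()) perform.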
Comments
URL: https://hg.openjdk.java.net/jdk/jdk/rev/11ff4e485670 User: tschatzl Date: 2019-11-29 09:20:49 +0000
29-11-2019

Check all calls to get_new_prediction(); in particular, a few ratios are missing one or both boundary checks against 0.0/1.0.
25-07-2019