JDK-8238842 : AIOOBE in GIFImageReader.initializeStringTable
Type:Bug
Component:client-libs
Sub-Component:javax.imageio
Affected Version:8u241,11,13,14,15
Priority:P3
Status:Resolved
Resolution:Fixed
Submitted:2020-02-11
Updated:2020-07-28
Resolved:2020-02-12
The problem here is that "initCodeSize" is read from the stream and used without validation.
The string table is initially populated up to index (2^initCodeSize)-1, but the table has only 4096 entries.
The GIF here has initCodeSize set to 18 (decimal), while the maximum that can legitimately be specified is 12 (2^12 = 4096).
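A minimal sketch of the failure mode, with illustrative names (not the actual GIFImageReader fields): the table is sized at 4096 entries, then seeded with one root entry per value up to 2^initCodeSize, which overflows once initCodeSize exceeds 12.

```java
// Hypothetical reproduction of the bug's shape: the decoder sizes its
// string table at 4096 entries (GIF's LZW ceiling) but seeds
// 2^initCodeSize root entries without validating initCodeSize first.
public class StringTableSketch {
    static final int MAX_TABLE_SIZE = 4096; // 2^12, the LZW maximum for GIF

    static void initializeStringTable(int initCodeSize) {
        int[] prefix = new int[MAX_TABLE_SIZE];
        int numRoots = 1 << initCodeSize; // 2^18 = 262144 when initCodeSize == 18
        for (int i = 0; i < numRoots; i++) {
            prefix[i] = i; // AIOOBE once i reaches 4096
        }
    }

    public static void main(String[] args) {
        try {
            initializeStringTable(18); // value taken from the malformed GIF
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("AIOOBE reproduced: " + e.getMessage());
        }
    }
}
```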
However, I don't think checking for > 12 is completely correct, even though it would fix the problem.
If you look at the writer, the value it writes out is bits-per-pixel, which for GIF is a maximum of 8.
And based on the LZW algorithm, what you want here is the set of root entries, which therefore has to max
out at 2^8 (256) in GIF.
Looking at the giflib code, that is what it checks for.
I've also looked at the AWT Toolkit GIF decoder code; it checks for >= 12 (not > 12).
I guess that didn't matter, since 8 is the real limit.
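A validation along these lines would match giflib's behavior; this is an illustrative sketch, not the actual JDK patch, and the class/method names are assumptions:

```java
// Illustrative validation, not the actual JDK fix. GIF pixel data is at
// most 8 bits per pixel, so the initial LZW code size read from the
// stream can never legitimately exceed 8; giflib rejects anything larger.
import java.io.IOException;

public class InitCodeSizeCheck {
    static void checkInitCodeSize(int initCodeSize) throws IOException {
        if (initCodeSize < 1 || initCodeSize > 8) {
            throw new IOException(
                "Invalid LZW initial code size: " + initCodeSize);
        }
    }

    public static void main(String[] args) throws IOException {
        checkInitCodeSize(8); // bits-per-pixel maximum: accepted
        try {
            checkInitCodeSize(18); // value from the malformed GIF: rejected
        } catch (IOException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```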
Since GIF specifies 12 as the maximum code size - and a decoder grows the code size dynamically up to that limit
as it reads - if you were to fill all 4096 entries with roots, you'd have no compression at all, as there'd be
no space left for additional sequences.
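One way to see the 12-bit ceiling is to simulate the code-size growth; this is a simplified sketch under my own naming, not decoder code from the JDK:

```java
// Illustrative sketch (not JDK code) of LZW code-size growth in a GIF
// decoder: codes start at initCodeSize + 1 bits and widen each time the
// table reaches the current code range, capping at 12 bits / 4096 entries.
public class CodeSizeGrowth {
    static int finalCodeSize(int initCodeSize) {
        int codeSize = initCodeSize + 1;         // +1 for the clear/EOI codes
        int nextEntry = (1 << initCodeSize) + 2; // roots + clear + end-of-information
        while (nextEntry < 4096) {               // simulate adding one entry per code read
            if (nextEntry == (1 << codeSize) && codeSize < 12) {
                codeSize++;                      // widen codes once the range is full
            }
            nextEntry++;
        }
        return codeSize;
    }

    public static void main(String[] args) {
        // With 8-bit pixels the code size grows 9 -> 10 -> 11 -> 12.
        System.out.println(finalCodeSize(8));
    }
}
```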