JDK-6478546 : FileInputStream.read() throws OutOfMemoryError when there is plenty available

Details
Type: Bug
Submit Date: 2006-10-05
Status: Open
Updated Date: 2013-05-02
Project Name: JDK
Resolved Date:
Component: core-libs
OS: windows_xp
Sub-Component: java.io
CPU: x86
Priority: P4
Resolution: Unresolved
Affected Versions: 6
Targeted Versions:
Related Reports

Sub Tasks

Description
FULL PRODUCT VERSION :
java version "1.5.0_06"
Java(TM) 2 Runtime Environment, Standard Edition (build 1.5.0_06-b05)
Java HotSpot(TM) Client VM (build 1.5.0_06-b05, mixed mode)

java version "1.5.0_08"
Java(TM) 2 Runtime Environment, Standard Edition (build 1.5.0_08-b03)
Java HotSpot(TM) Client VM (build 1.5.0_08-b03, mixed mode, sharing)

java version "1.6.0-beta2"
Java(TM) 2 Runtime Environment, Standard Edition (build 1.6.0-beta2-b72)
Java HotSpot(TM) Client VM (build 1.6.0-beta2-b72, mixed mode)

ADDITIONAL OS VERSION INFORMATION :
Windows XP SP2

EXTRA RELEVANT SYSTEM CONFIGURATION :
2GB RAM

A DESCRIPTION OF THE PROBLEM :
Attempting to read a large file in one chunk with FileInputStream.read() causes an OutOfMemoryError when the heap space is TOO large. According to the spec, AFAIK, even if read() runs out of space it should just return with as much data as it can, not throw an exception. In any case there was plenty of memory available, and furthermore the same code runs fine with a smaller heap!

STEPS TO FOLLOW TO REPRODUCE THE PROBLEM :
Run the attached code with heap space (-Xmx) of 1300M and 1400M. With 1300M it runs fine; with 1400M it throws an exception. The "bigfile" in my test case had a length of 251 503 002 bytes. I can provide the file, but it should not make a difference, as the program just tries to read all the bytes.

EXPECTED VERSUS ACTUAL BEHAVIOR :
EXPECTED -
Either FileInputStream.read() should have read the whole file (as there was 1400 MB of heap available and the file size was 'only' about 250 MB), or it should have read some smaller part of the file and returned the number of bytes read. It should not have thrown an exception.
ACTUAL -
With a TOO large heap size the code throws an OutOfMemoryError.

ERROR MESSAGES/STACK TRACES THAT OCCUR :

C:\tests>"C:\Program Files\Java\jre1.6.0\bin\java.exe" -Xmx1400m -classpath . r3
d
size: 251503002
buf ok
Exception in thread "main" java.lang.OutOfMemoryError
        at java.io.FileInputStream.readBytes(Native Method)
        at java.io.FileInputStream.read(Unknown Source)
        at r3d.main(r3d.java:19)

REPRODUCIBILITY :
This bug can be reproduced always.

---------- BEGIN SOURCE ----------
import java.io.*;

public class r3d {

    public static void main(String[] args) {
        try {
            int size = 501 * 501 * 501 * 2;

            FileInputStream fis = new FileInputStream("bigfile"); // Any file with size >= 501*501*501*2

            System.out.println("size: " + size);

            // Allocating the buffer itself succeeds...
            byte[] buf = new byte[size];

            System.out.println("buf ok");

            // ...but this single read() throws OutOfMemoryError with -Xmx1400m
            int bytesRead = fis.read(buf, 0, size);
            System.out.println("Bytes read " + bytesRead);
        }
        catch (Exception e) {
            e.printStackTrace();
        }
    }
}

---------- END SOURCE ----------

CUSTOMER SUBMITTED WORKAROUND :
Attempting to read the file in multiple smaller chunks seems to work (see the sketch below).
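A minimal sketch of that kind of chunked read (not the submitter's code), assuming the same "bigfile" and target size as the reproducer; the class name r3dChunked and the 1 MB chunk size are illustrative choices only:

import java.io.FileInputStream;
import java.io.IOException;

public class r3dChunked {
    public static void main(String[] args) throws IOException {
        int size = 501 * 501 * 501 * 2;   // same total size as the reproducer
        byte[] buf = new byte[size];
        int chunk = 1024 * 1024;          // request at most 1 MB per read() call (illustrative cap)

        FileInputStream fis = new FileInputStream("bigfile");
        try {
            int off = 0;
            while (off < size) {
                int n = fis.read(buf, off, Math.min(chunk, size - off));
                if (n < 0) {
                    break;                // end of file reached early
                }
                off += n;
            }
            System.out.println("Bytes read " + off);
        } finally {
            fis.close();
        }
    }
}

Because each read() call only asks for a bounded amount, the native layer never needs a transfer buffer anywhere near the size of the whole array.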

                                    

Comments
EVALUATION

A malloc'ed buffer is used for I/O when the byte array is larger than 8 KB. We should consider limiting the size of that buffer to something reasonable (say 1 MB); see the sketch below. Additionally, this temporary buffer could be thread-local rather than malloc'ed per call, to reduce the cost of malloc/free on every I/O.
                                     
2009-08-17
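A rough Java-level illustration of the idea described in that evaluation; the real change would live in the native readBytes implementation, and the 1 MB cap, class name CappedTransfer, and method readCapped here are assumptions for illustration only:

import java.io.IOException;
import java.io.InputStream;

final class CappedTransfer {
    private static final int MAX_TRANSFER = 1024 * 1024; // proposed cap on the staging buffer, e.g. 1 MB

    // Reads up to len bytes into b starting at off, but never stages more
    // than MAX_TRANSFER bytes in the temporary buffer for a single call.
    static int readCapped(InputStream in, byte[] b, int off, int len) throws IOException {
        if (len == 0) {
            return 0;
        }
        byte[] tmp = new byte[Math.min(len, MAX_TRANSFER)]; // bounded scratch buffer
        int n = in.read(tmp, 0, tmp.length);                // may return fewer bytes than requested
        if (n > 0) {
            System.arraycopy(tmp, 0, b, off, n);
        }
        return n;
    }
}

The copy through tmp mirrors what the native code does (read into a malloc'ed buffer, then copy into the Java array); capping tmp bounds the native allocation no matter how large the caller's array is, and read() is already allowed to return fewer bytes than requested.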


