I have a Java 6 embedded HttpServer. It has a handler that allows clients to download a big text file. The problem is that when the server has more than 10 simultaneous clients, I get an OutOfMemoryError. I'm pretty sure the problem is around the HttpServer.
HttpServer m_server = HttpServer.create(new InetSocketAddress(8080), 0);
m_server.createContext("/DownloadFile", new DownloadFileHandler() );
public class DownloadFileHandler implements HttpHandler {
    private static byte[] myFile = new String("....................").getBytes(); // string of about 8 MB

    @Override
    public void handle(HttpExchange exchange) throws IOException {
        exchange.sendResponseHeaders(HTTP_OK, myFile.length);
        OutputStream responseBody = exchange.getResponseBody();
        responseBody.write(myFile);
        responseBody.close();
    }
}
The exception I get is:
java.lang.OutOfMemoryError: Java heap space
at java.nio.HeapByteBuffer.<init>(Unknown Source)
at java.nio.ByteBuffer.allocate(Unknown Source)
at sun.net.httpserver.Request$WriteStream.write(Unknown Source)
at sun.net.httpserver.FixedLengthOutputStream.write(Unknown Source)
at java.io.FilterOutputStream.write(Unknown Source)
at sun.net.httpserver.PlaceholderOutputStream.write(Unknown Source)
at com.shunra.javadestination.webservices.DownloadFileHandler.handle(Unknown Source)
at com.sun.net.httpserver.Filter$Chain.doFilter(Unknown Source)
at sun.net.httpserver.AuthFilter.doFilter(Unknown Source)
at com.sun.net.httpserver.Filter$Chain.doFilter(Unknown Source)
at sun.net.httpserver.ServerImpl$Exchange$LinkHandler.handle(Unknown Source)
at com.sun.net.httpserver.Filter$Chain.doFilter(Unknown Source)
at sun.net.httpserver.ServerImpl$Exchange.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Exception in thread "pool-1-thread-24" java.lang.OutOfMemoryError:
The suggestion regarding getBytes() doesn't change the exception. I have tried holding a static reference to the byte[] instead of creating it each time, and I still get the same exception.
Do not do that for large files:
byte[] bytesToSend = myFile.getBytes();
This is inefficient, and you need heap space for storing the whole file's data. You waste a lot of heap space when you first read the file completely and afterwards write it completely.
Instead, read and write the file data in chunks of a fixed size, directly from the file to the response. You can write the code yourself or just use a utility class like IOUtils from Apache Commons IO.
It is important not to read the whole file before writing it; instead, process it in smaller chunks. Use streams here and avoid anything that deals with byte[] except for buffering the small chunks.
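If you want to avoid the Apache Commons dependency, the same chunked copy can be written by hand with a small fixed buffer; a minimal sketch (the class name, buffer size, and the in-memory streams used for the demo are my own choices):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ChunkedCopy {

    // Copy the stream in 8 KB chunks: only the buffer lives on the heap,
    // never the whole file.
    static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[100000]; // stand-in for the big file
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        copy(new ByteArrayInputStream(data), sink);
        System.out.println(sink.size());
    }
}
```

Regardless of the buffer size, the heap cost per client stays constant instead of growing with the file.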
Edit: Here's some code with Apache Commons IO...
public static void main(String[] args) {
    HttpExchange exchange = ...;
    OutputStream responseBody = null;
    try {
        File file = new File("big-file.txt");
        long bytesToSkip = 4711; // determine how many bytes to skip
        exchange.sendResponseHeaders(200, file.length() - bytesToSkip);
        responseBody = exchange.getResponseBody();
        skipAndCopy(file, responseBody, bytesToSkip);
    }
    catch (IOException e) {
        // handle it
    }
    finally {
        IOUtils.closeQuietly(responseBody);
    }
}
private static void skipAndCopy(File src, @WillNotClose OutputStream dest, long bytesToSkip) throws IOException {
    InputStream in = null;
    try {
        in = FileUtils.openInputStream(src);
        IOUtils.skip(in, bytesToSkip);
        IOUtils.copyLarge(in, dest);
    }
    finally {
        IOUtils.closeQuietly(in);
    }
}
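As an aside, if you can move beyond Java 6, try-with-resources (Java 7+) removes the closeQuietly boilerplate. A self-contained sketch under that assumption, with plain JDK calls standing in for IOUtils.skip/copyLarge (the class name and demo file are my own):

```java
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class SkipAndCopyDemo {

    // Java 7+ variant of skipAndCopy: try-with-resources closes the input
    // stream automatically, even when an exception is thrown.
    static void skipAndCopy(File src, OutputStream dest, long bytesToSkip) throws IOException {
        try (InputStream in = new FileInputStream(src)) {
            // InputStream.skip may skip fewer bytes than requested, so loop.
            long skipped = 0;
            while (skipped < bytesToSkip) {
                long n = in.skip(bytesToSkip - skipped);
                if (n <= 0) {
                    break; // end of file reached before the skip completed
                }
                skipped += n;
            }
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                dest.write(buffer, 0, read);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        File tmp = File.createTempFile("big-file", ".txt");
        tmp.deleteOnExit();
        try (OutputStream out = new FileOutputStream(tmp)) {
            out.write("0123456789".getBytes("US-ASCII"));
        }
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        skipAndCopy(tmp, sink, 4); // skip the first 4 bytes
        System.out.println(sink.toString("US-ASCII")); // 456789
    }
}
```

The skip loop matters because InputStream.skip is allowed to skip fewer bytes than requested in a single call.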