
Server gives 500 code error but still serves content


I'm serving "serveraddress/img/:imageid" with this function:

 public static String getImage(Request req, Response res) {
        String imageId = req.params("imageid");
        byte[] bytes = readImage("img/" + imageId);

        return Base64.getMimeEncoder().encodeToString(bytes); 
    }

 public static byte[] readImage(String image) {
        byte[] bytes;
        try {
            File fff = new File(image);
            FileInputStream fileInputStream = new FileInputStream(fff);
            long byteLength = fff.length();

            bytes = new byte[(int) byteLength];
            fileInputStream.read(bytes, 0, (int) byteLength);
        } catch (Exception ex) {
            ex.printStackTrace();
            bytes = null;
        }
        return bytes;
    }

The first few requests are served normally, but after five or six requests for images the server starts returning error code 500 without printing any stack trace. I'm not sure what the issue is; I hope someone can help me figure it out.

Here is my build.gradle file for anyone who wants to test it out:

testCompile group: 'junit', name: 'junit', version: '4.12'
compile group: 'org.slf4j', name: 'slf4j-log4j12', version: '1.7.21'
compile "com.sparkjava:spark-core:2.7.1"

And this is the main function:

public static void main(String[] args) {
        BasicConfigurator.configure();
        org.apache.log4j.Logger.getRootLogger().setLevel(Level.ERROR);
        port(8883);
        get("/img/:imageid", this::getImage);
}

Solution

  • It looks like your code never closes the FileInputStream it opens, so every request leaks a file descriptor. Once the process hits its open-file limit, new opens fail and the handler errors out with a 500. Close the stream with try-with-resources or in a finally block (and as a side note, a linter or IDE inspection can warn you about unclosed resources).
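A minimal sketch of the fix, keeping your method signature (the class name ImageReader and the demo in main are made up for illustration): try-with-resources guarantees the stream is closed even if read() throws, and the loop handles the fact that read() may return fewer bytes than requested.

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ImageReader {

    // The stream is declared in the try(...) header, so it is closed
    // automatically on every exit path - no leaked file descriptors.
    public static byte[] readImage(String image) {
        File file = new File(image);
        byte[] bytes = new byte[(int) file.length()];
        try (FileInputStream in = new FileInputStream(file)) {
            int offset = 0;
            // read() may return fewer bytes than asked for; loop until full.
            while (offset < bytes.length) {
                int n = in.read(bytes, offset, bytes.length - offset);
                if (n < 0) break; // unexpected end of file
                offset += n;
            }
        } catch (IOException ex) {
            ex.printStackTrace();
            return null;
        }
        return bytes;
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical demo: write a small temp file and read it back.
        Path tmp = Files.createTempFile("demo", ".bin");
        Files.write(tmp, new byte[] {1, 2, 3});
        byte[] data = readImage(tmp.toString());
        System.out.println(data.length); // prints 3
        Files.delete(tmp);
    }
}
```

If you don't need the manual stream handling at all, `Files.readAllBytes(Paths.get(image))` does the same job in one call and also closes everything internally.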