Tags: java, android, fileoutputstream, randomaccessfile, bufferedinputstream

Move 100 bytes from end of file to beginning of file with Java (RandomAccessFile is too slow)


I need to move the last 100 bytes of a file to the beginning of the file in Java (on Android). After doing some research I have a working solution, but it is far too slow for some of the larger files I'm working with (up to 2 GB). I originally tried the read() method on the RandomAccessFile object and it was much too slow, so after some more digging I switched to a BufferedInputStream, but that doesn't appear to improve performance at all.

I'm thinking there's got to be a simpler, easier, faster way to do this.

Here is the working code I have that is just too slow:

    File file = new File(Environment.getExternalStorageDirectory() + "/sam.dll");
    RandomAccessFile f;
    OutputStream f1;
    try {
        f = new RandomAccessFile(file, "r");
        long size = file.length();
        f.seek(size - 100);
        FileInputStream fis = new FileInputStream(f.getFD());
        BufferedInputStream bis = new BufferedInputStream(fis);
        try {
            f1 = new FileOutputStream(new File(Environment.getExternalStorageDirectory() + "/sam.dl4"));
            // write the last 100 bytes to the start of the new file
            for (int i = 0; i < 100; i++) {
                f1.write(bis.read());
            }
            // then append the rest of the file, one byte at a time
            f.seek(0);
            bis = new BufferedInputStream(fis);
            for (int j = 0; j < size - 100; j++) {
                f1.write(f.read());
            }
            f.close();
            f1.close();
            bis.close();
            fis.close();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
    } catch (FileNotFoundException e) {
        Log.e("blah", e.toString());
    } catch (IOException e) {
        e.printStackTrace();
    }

Any suggestions for ways I can speed this up? Am I going about this entirely the wrong way? I implemented the same thing in C# with a FileStream object and it shifts the bytes in seconds (even for 2 GB files), but with the method above it literally takes hours.

TIA


Solution

  • As already stated, byte-wise I/O operations drain performance.

    The appropriate way would be:

    • allocate a buffer of appropriate size
    • have your InputStream fill the buffer in one or very few I/O operations
    • manipulate the buffer
    • have your OutputStream flush the buffer to disk in one or very few I/O operations

    In Java (only using classes you already used):

    byte[] buf = new byte[4096];
    // lengthRead is the count of bytes read
    int lengthRead = inputStream.read(buf);
    doBufferMagic(buf, lengthRead);
    outputStream.write(buf, 0, lengthRead);
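
    Applied to your concrete task (moving the last 100 bytes to the front), an untested sketch along those lines could look like the following. It reuses the paths from your code; the 64 KB buffer size is just an assumption you can tune:

    File src = new File(Environment.getExternalStorageDirectory() + "/sam.dll");
    File dst = new File(Environment.getExternalStorageDirectory() + "/sam.dl4");
    long size = src.length();

    try {
        RandomAccessFile in = new RandomAccessFile(src, "r");
        BufferedOutputStream out = new BufferedOutputStream(new FileOutputStream(dst));
        try {
            byte[] buf = new byte[64 * 1024];

            // 1. copy the last 100 bytes to the start of the destination
            in.seek(size - 100);
            in.readFully(buf, 0, 100);
            out.write(buf, 0, 100);

            // 2. copy the remaining (size - 100) bytes in large chunks
            in.seek(0);
            long remaining = size - 100;
            while (remaining > 0) {
                int chunk = (int) Math.min(buf.length, remaining);
                in.readFully(buf, 0, chunk);
                out.write(buf, 0, chunk);
                remaining -= chunk;
            }
        } finally {
            in.close();
            out.close();
        }
    } catch (IOException e) {
        e.printStackTrace();
    }

    Writing the trailing 100 bytes first and then streaming the rest in 64 KB chunks replaces the per-byte read/write calls with a few thousand bulk operations, which should bring the runtime much closer to your C# FileStream version.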