I have a `RandomAccessFile` `fileA` of size 4*k bytes, which was written using something like the `DataOutputStream.writeInt()` method. I wish to read it fully into an `int[k]` array. What is the fastest way to do it?
I have considered wrapping a `DataInputStream` around a `BufferedInputStream`, and also using `readFully()` into a byte array and then assembling the ints by shifting the bits manually, but I am unsure how fast the second option would actually be, given the overhead of an extra pass over the data. I have yet to look into `java.nio`, though.
TL;DR: What is the fastest way to fully read a file of integers into an `int[]`?
Edit: I am running this on a remote machine where I only have access to the JVM memory, without the ability to memory map files.
Upon running a set of benchmarks on different-sized files, 50 runs in batches of 1000, I have concluded that the performance difference between method A and method B (below) is insignificant, with less than 1% difference in run time. Further methods have yet to be tested, and I would appreciate someone else confirming or denying the validity of my results.
Method A:
for (int k = 0; k < fileIntLength; k++) {
    ints[k] = read.readInt();
}
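For context, a self-contained version of method A might look like the following. The round-trip write and the value pattern are illustrative only, not from my actual benchmark; in the real setup, `read` is a `DataInputStream` over a `BufferedInputStream`:

```java
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class ReadIntsLoop {
    // Method A: read the ints one at a time through a buffered DataInputStream.
    static int[] readAll(File f, int fileIntLength) throws IOException {
        int[] ints = new int[fileIntLength];
        try (DataInputStream read = new DataInputStream(
                new BufferedInputStream(new FileInputStream(f)))) {
            for (int k = 0; k < fileIntLength; k++) {
                ints[k] = read.readInt(); // big-endian, matching writeInt()
            }
        }
        return ints;
    }

    public static void main(String[] args) throws IOException {
        // Write a small test file the same way the question describes.
        File f = File.createTempFile("ints", ".bin");
        f.deleteOnExit();
        int n = 1000;
        try (DataOutputStream out = new DataOutputStream(
                new BufferedOutputStream(new FileOutputStream(f)))) {
            for (int i = 0; i < n; i++) out.writeInt(i * 3);
        }
        int[] ints = readAll(f, n);
        System.out.println(ints[0] + " " + ints[999]); // 0 2997
    }
}
```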
Method B:
byte[] temp = new byte[fileIntLength * 4];
read.readFully(temp);
if (temp.length > 0) {
    // array() would throw UnsupportedOperationException on the int view,
    // so bulk-copy into the existing array instead:
    ByteBuffer.wrap(temp).asIntBuffer().get(ints);
}
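Since `java.nio` came up, a `FileChannel`-based variant (call it method C, untested in my benchmark) could look like the sketch below. Note the bulk copy via `IntBuffer.get(int[])` rather than `array()`, since the int view of a wrapped byte buffer is not backed by an int array:

```java
import java.io.DataOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.StandardOpenOption;
import java.util.Arrays;

public class ReadIntsNio {
    // Method C (sketch): read the whole file through a FileChannel into a
    // ByteBuffer, then bulk-copy the big-endian ints out of the int view.
    static int[] readAll(File f) throws IOException {
        try (FileChannel ch = FileChannel.open(f.toPath(), StandardOpenOption.READ)) {
            ByteBuffer buf = ByteBuffer.allocate((int) ch.size());
            while (buf.hasRemaining() && ch.read(buf) != -1) { } // fill the buffer
            buf.flip();
            int[] ints = new int[buf.remaining() / 4];
            buf.asIntBuffer().get(ints); // default byte order matches writeInt()
            return ints;
        }
    }

    public static void main(String[] args) throws IOException {
        File f = File.createTempFile("ints", ".bin");
        f.deleteOnExit();
        try (DataOutputStream out = new DataOutputStream(new FileOutputStream(f))) {
            for (int i = 0; i < 5; i++) out.writeInt(i * i);
        }
        System.out.println(Arrays.toString(readAll(f))); // [0, 1, 4, 9, 16]
    }
}
```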
Edit: I also noticed a speed increase as the JIT compiled parts of the code to machine code, but the increase was almost identical in each case: roughly 3000 arbitrary units of time for the first few runs, decreasing to about 2800 as the JVM optimised.