Is it possible to memory-map huge files (multiple GBs) in Java?
This method of FileChannel looks promising:

MappedByteBuffer map(FileChannel.MapMode mode, long position, long size)

Both position and size allow for 64-bit values -- so far, so good. MappedByteBuffer, however, only provides methods for 32-bit positions (get(int index), position(int newPosition), etc.), which seems to imply that I cannot map files larger than 2 GB.
How can I get around this limitation?
Take a look at the code from Using a memory mapped file for a huge matrix, which shows how to create a list of MappedByteBuffer objects, each smaller than 2 GB, that together map the entire file:
private static final int MAPPING_SIZE = 1 << 30;  // 1 GB per mapping, comfortably below Integer.MAX_VALUE
...
long size = 8L * width * height;  // total file size in bytes (8 bytes per element)
// Map the file as a sequence of regions, each at most MAPPING_SIZE bytes long
for (long offset = 0; offset < size; offset += MAPPING_SIZE) {
    long size2 = Math.min(size - offset, MAPPING_SIZE);
    mappings.add(raf.getChannel().map(FileChannel.MapMode.READ_WRITE, offset, size2));
}
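With the file split across several buffers, each access has to translate a global long offset into a mapping index plus an int offset within that mapping. Here is a minimal sketch of such helpers, assuming the mappings list and MAPPING_SIZE from the snippet above (the getDouble/putDouble helper names are just illustrative):

// Read/write an 8-byte value at an absolute byte offset in the file.
// A value never straddles two mappings because MAPPING_SIZE (1 GB) is a
// multiple of 8 and element offsets are multiples of 8.
double getDouble(List<MappedByteBuffer> mappings, long offset) {
    int mapN  = (int) (offset / MAPPING_SIZE);  // which mapping the offset falls into
    int index = (int) (offset % MAPPING_SIZE);  // position inside that mapping; always fits in an int
    return mappings.get(mapN).getDouble(index);
}

void putDouble(List<MappedByteBuffer> mappings, long offset, double value) {
    int mapN  = (int) (offset / MAPPING_SIZE);
    int index = (int) (offset % MAPPING_SIZE);
    mappings.get(mapN).putDouble(index, value);
}

Since MAPPING_SIZE is a power of two, the division and modulo could also be written as a shift and a mask, but the plain form is clearer.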
As per JDK-6347833, "(fs) Enhance MappedByteBuffer to support sizes >2GB on 64 bit platforms", the reason for the 2 GB limit is:
A MappedByteBuffer is a ByteBuffer with additional operations to support memory-mapped file regions. To support mapping a region larger than Integer.MAX_VALUE would require a parallel hierarchy of classes. For now the only solution is to create multiple MappedByteBuffers where each corresponds to a region that is no larger than 2GB.