I have a log file which is constantly updated with new lines of data. I need to get newly added data in Java as soon as it is written. For now my solution is:
public static void readNonStop(String filename, boolean goToEnd, FileReadCallback readCallback) {
    if (readCallback == null) {
        return;
    }
    try (BufferedReader br = new BufferedReader(new FileReader(filename))) {
        int lineNumber = 0;
        if (goToEnd) {
            // drain the existing content so only newly appended lines are reported
            while (br.readLine() != null) { }
        }
        while (true) {
            String line = br.readLine();
            if (line != null) {
                readCallback.onRead(lineNumber++, line);
            } else {
                Thread.sleep(1); // no new data yet; poll again shortly
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
But I have a feeling that there should be a better way. I don't like the idea of a constantly running loop with a "sleep" inside and would prefer some sort of event-driven approach.
If I rely on file-system events to re-open the file each time it is modified, that introduces a delay.
What is the correct way of doing it for this situation?
Thanks in advance!
Files are not designed to be a messaging solution. Even TCP over loopback can have a latency of 10 to 30 microseconds. Without changing the file format, your solution is likely to be about as fast as you can get.
NOTE: you don't have to sleep for a whole millisecond. You can use Thread.yield() or LockSupport.parkNanos(100_000) instead.
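A minimal sketch of the same polling loop with parkNanos instead of Thread.sleep(1); the Tail class name and Consumer callback are illustrative, not from the original code. Parking for ~100 microseconds instead of a full millisecond cuts the worst-case pickup latency roughly tenfold, at the cost of more wake-ups:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.concurrent.locks.LockSupport;
import java.util.function.Consumer;

public class Tail {
    // Polls the file for new lines; backs off ~100 µs when no data is available.
    public static void tail(String filename, Consumer<String> onLine) throws IOException {
        try (BufferedReader br = new BufferedReader(new FileReader(filename))) {
            // parkNanos returns immediately on interrupt, so interrupting
            // the thread is a clean way to stop the loop.
            while (!Thread.currentThread().isInterrupted()) {
                String line = br.readLine();
                if (line != null)
                    onLine.accept(line);       // deliver the new line
                else
                    LockSupport.parkNanos(100_000); // ~100 µs back-off
            }
        }
    }
}
```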
For a more complex strategy, you can use a class like LongPauser, which backs off in a configurable way.
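As a sketch of that idea (the class name, parameters, and doubling policy here are illustrative assumptions; Chronicle's actual LongPauser differs in detail): yield a few times while data is likely to arrive soon, then park for exponentially longer intervals up to a cap, and reset whenever work is found.

```java
import java.util.concurrent.locks.LockSupport;

// Hypothetical back-off pauser: cheap yields first, then capped exponential parks.
public class BackoffPauser {
    private final int maxYields;
    private final long minNanos, maxNanos;
    private long currentNanos;
    private int yields;

    public BackoffPauser(int maxYields, long minNanos, long maxNanos) {
        this.maxYields = maxYields;
        this.minNanos = minNanos;
        this.maxNanos = maxNanos;
        reset();
    }

    /** Call when work was found; restarts the back-off from the cheap end. */
    public void reset() {
        yields = 0;
        currentNanos = minNanos;
    }

    /** Call when no work was found; cost grows from yields to capped parks. */
    public void pause() {
        if (yields < maxYields) {
            yields++;
            Thread.yield();
        } else {
            LockSupport.parkNanos(currentNanos);
            currentNanos = Math.min(maxNanos, currentNanos * 2); // double up to the cap
        }
    }

    /** Exposed for inspection: the next park duration in nanoseconds. */
    public long currentParkNanos() {
        return currentNanos;
    }
}
```

The calling loop then becomes `if (line != null) { handle(line); pauser.reset(); } else { pauser.pause(); }`, so latency stays low while the file is active and CPU use drops when it is idle.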
BTW I implemented a solution to write/read files in a low latency way called Chronicle Queue. This has sub-microsecond latencies using a binary format for speed.
NOTE: You can jump to the end by skipping all the bytes available() when you open the file as a FileInputStream. This might leave you with an incomplete line, depending on how the writer's buffering works.
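A sketch of that skip-to-end open, assuming the file is opened as a FileInputStream (the SkipToEnd class name is illustrative). Note the caveat above: if the writer is mid-line when you open, the first line read afterwards may be a fragment.

```java
import java.io.BufferedReader;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStreamReader;

public class SkipToEnd {
    // Opens the file positioned at its current end, so only new data is read.
    public static BufferedReader openAtEnd(String filename) throws IOException {
        FileInputStream in = new FileInputStream(filename);
        in.skip(in.available()); // skip past all bytes currently in the file
        return new BufferedReader(new InputStreamReader(in));
    }
}
```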