I'm creating a library that consists of a Log4J appender that asynchronously sends events to a remote server. When a log statement is made, the appender will asynchronously record the event into a local queue, which a pool of consumers will then retrieve and send to the remote server.
The completely in-memory solution would be to use a BlockingQueue, which would handle the concurrency for me. However, I'd like the queue to be persisted so that if the remote server is unavailable, the queue doesn't grow unbounded (or start discarding messages, in the case of a bounded queue).
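For reference, the in-memory approach is straightforward: the appender offers events into a bounded `ArrayBlockingQueue`, and a pool of consumer threads takes them off and ships them. A minimal sketch (class and method names are my own, not from any library) might look like this:

```java
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

public class AsyncQueueSketch {
    // bounded queue: offer() fails fast when full instead of blocking the logging thread
    private final BlockingQueue<String> queue = new ArrayBlockingQueue<>(1000);

    /** Called from the appender thread; returns false if the event was dropped. */
    public boolean append(String event) {
        return queue.offer(event);
    }

    /** Consumer loop; in the real appender this would send each event to the remote server. */
    public void consumeLoop(List<String> sink, CountDownLatch delivered) {
        try {
            while (true) {
                // poll with a timeout so this sketch terminates once the queue is drained
                String event = queue.poll(100, TimeUnit.MILLISECONDS);
                if (event == null) {
                    break;
                }
                sink.add(event);
                delivered.countDown();
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    public static void main(String[] args) throws Exception {
        AsyncQueueSketch sketch = new AsyncQueueSketch();
        List<String> sent = new CopyOnWriteArrayList<>();
        CountDownLatch delivered = new CountDownLatch(3);

        Thread consumer = new Thread(() -> sketch.consumeLoop(sent, delivered));
        consumer.start();

        sketch.append("event-1");
        sketch.append("event-2");
        sketch.append("event-3");

        delivered.await(5, TimeUnit.SECONDS);
        consumer.join();
        System.out.println(sent.size()); // 3
    }
}
```

The drawback, as noted above, is that everything in the queue is lost on a crash, and a bounded queue must either block or drop when the remote is down for long enough.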
I was thinking of using an embedded H2 database to store the events locally and then using a polling mechanism to retrieve events and send them to the remote server. I would much rather use a BlockingQueue than poll a database table.
Is JMS the answer?
EDIT:
If JMS is the answer, and it seems to be going that way, does anyone have recommendations on a lightweight, embeddable JMS solution that can be configured to only accept messages in-process? In other words, I do not want to, and possibly will not be allowed to, open up a TCP socket on which to listen.
EDIT:
I've got ActiveMQ embedded now and it seems to be working. Thanks all.
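For anyone finding this later, here is roughly what the embedded setup looks like. This is a hedged sketch assuming ActiveMQ 5.x is on the classpath: the `vm://` transport runs the broker inside the same JVM with no TCP listener, and `broker.persistent=true` enables file-backed persistence (KahaDB by default), which covers the "persist when the remote is down" requirement.

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Session;
import org.apache.activemq.ActiveMQConnectionFactory;

// vm:// starts an in-process broker on first use; no socket is opened,
// so nothing outside this JVM can connect to it.
ConnectionFactory factory =
        new ActiveMQConnectionFactory("vm://localhost?broker.persistent=true");

Connection connection = factory.createConnection();
connection.start();
Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
```

Consumers on this session then replace the hand-rolled queue-polling thread pool.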
You could use JMS to asynchronously send messages to a remote machine (assuming it can receive them, of course); Log4j has a JMSAppender you can use for this.
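A minimal `log4j.properties` wiring of the stock `org.apache.log4j.net.JMSAppender` to an embedded ActiveMQ broker might look like the following; the topic name `dynamicTopics/logTopic` is just an illustrative choice, and the JNDI binding names depend on your broker's setup:

```properties
log4j.rootLogger=INFO, jms

log4j.appender.jms=org.apache.log4j.net.JMSAppender
log4j.appender.jms.InitialContextFactoryName=org.apache.activemq.jndi.ActiveMQInitialContextFactory
# vm:// keeps the broker in-process; broker.persistent=true enables file-backed persistence
log4j.appender.jms.ProviderURL=vm://localhost?broker.persistent=true
log4j.appender.jms.TopicConnectionFactoryBindingName=ConnectionFactory
log4j.appender.jms.TopicBindingName=dynamicTopics/logTopic
```

Note that the JMSAppender publishes the logging events; you still need a JMS consumer on that topic to forward them to the remote server.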