I have a simple ServletContextListener that starts a process when Tomcat starts and shuts it down when Tomcat shuts down.
The process:

```
java -jar application.war -S rake jobs:work
```
It spawns a Ruby delayed_job worker that processes queues. However, a problem arises when there are many jobs to process or a job takes a while: the worker processes a few jobs and then just stops. No errors are thrown and nothing appears in the log; it simply halts execution.
When I restart the server, an entry appears in the log saying a shutdown signal was sent. The worker wakes up, finishes its current job (if it was paused mid-execution) and exits.
When I run the same command outside of Tomcat 7, it works as expected.
The `ServletContextListener` code:

```java
public class RakeServlet implements ServletContextListener
{
    private Process workerProcess;

    @Override
    public void contextInitialized(ServletContextEvent event)
    {
        try {
            workerProcess = Runtime.getRuntime()
                    .exec("java -jar application.war -S rake jobs:work");
        } catch (IOException e) {
            throw new RuntimeException("Failed to start worker process", e);
        }
    }

    @Override
    public void contextDestroyed(ServletContextEvent event)
    {
        if (workerProcess != null) {
            workerProcess.destroy();
        }
    }
}
```
`ps aux` output:

```
tomcat7   2119  0.5 16.9 3479300 668704 ?  Sl   02:20   2:41 java -jar application.war -S rake jobs:wor
```
The problem is that when running an external process you must consume both the stdout and stderr streams of the child process. Otherwise, once the output pipe buffers fill up, the child blocks on an I/O write and is effectively paused.
To do this properly, you'll need to start threads that monitor those streams and drain them (ideally doing something useful with the output rather than simply discarding it).
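A minimal sketch of such a stream-draining thread (the `StreamGobbler` class and its names are illustrative, not from the original post):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

// Drains a child process output stream on its own thread so the child
// never blocks on a full pipe buffer.
class StreamGobbler extends Thread
{
    private final InputStream stream;
    private final String prefix;

    StreamGobbler(InputStream stream, String prefix)
    {
        this.stream = stream;
        this.prefix = prefix;
        setDaemon(true); // don't keep the JVM alive just for this thread
    }

    @Override
    public void run()
    {
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(stream))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // Log the output instead of discarding it.
                System.out.println(prefix + line);
            }
        } catch (IOException ignored) {
            // The stream closes when the process exits; nothing more to drain.
        }
    }
}
```

In `contextInitialized`, you would start one gobbler per stream right after `exec`:

```java
new StreamGobbler(workerProcess.getInputStream(), "[worker stdout] ").start();
new StreamGobbler(workerProcess.getErrorStream(), "[worker stderr] ").start();
```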
For more details, have a look at Steve Liles's blog post on the subject, where he describes the problem and walks through example solutions.
If the target process is actually another Java process, you ought to consider running that code in-process instead, to avoid the overhead of launching a second JVM and the complexity of communicating with it over the stdio streams.
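As an aside, if you don't actually need to capture the worker's output, Java 7's `ProcessBuilder.inheritIO()` sidesteps the buffer problem entirely by forwarding the child's stdout/stderr to the parent process (under Tomcat that typically means catalina.out). This is an alternative not mentioned in the original answer; the `java -version` command below is just a stand-in for the real worker command:

```java
import java.io.IOException;

public class InheritIODemo
{
    public static void main(String[] args) throws IOException, InterruptedException
    {
        // inheritIO() makes the child write directly to the parent's
        // stdout/stderr, so no drain threads are required.
        Process p = new ProcessBuilder("java", "-version")
                .inheritIO()
                .start();
        System.out.println("worker exited with " + p.waitFor());
    }
}
```

The trade-off is that the output is no longer available to your code, so this only fits when logging to the container's console is good enough.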