I'm building integration tests for my Spark API. Sometimes I want to stop and start the Spark instance. When I do that, I sometimes run into the problem that I'm creating a new Spark instance while the old one is still shutting down on a separate thread. It would be helpful to know when the Spark instance has actually shut down.
First I start my Spark instance like this:
Spark.init();
Spark.awaitInitialization();
Then I stop it like this:
Spark.stop();
Now after I call stop(), the Spark service hasn't actually stopped yet!
Is there functionality similar to awaitInitialization(), or another way of knowing when the Spark service has actually stopped?
Spark 2.8.0 introduced an awaitStop() method: https://github.com/perwendel/spark/pull/730
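Conceptually, awaitStop() blocks the calling thread until the embedded server has fully shut down, much like awaitInitialization() blocks until startup completes. The sketch below illustrates that blocking behavior with a plain CountDownLatch standing in for Spark's internal shutdown signal; it is an analogy using only the JDK, not Spark's actual implementation:

```java
import java.util.concurrent.CountDownLatch;

public class AwaitStopDemo {
    // Latch standing in for Spark's internal "fully stopped" signal.
    private static final CountDownLatch stopped = new CountDownLatch(1);

    static void stopAsync() {
        // Simulates Spark.stop(): it returns immediately while the actual
        // shutdown happens on a separate thread.
        new Thread(() -> {
            try {
                Thread.sleep(200); // pretend the server takes a while to shut down
            } catch (InterruptedException ignored) {
            }
            stopped.countDown(); // server is now fully down
        }).start();
    }

    public static void main(String[] args) throws InterruptedException {
        stopAsync();
        // Analogous to Spark.awaitStop(): block until shutdown has completed,
        // so a subsequent restart cannot race the old instance.
        stopped.await();
        System.out.println("stopped");
    }
}
```

With the real API, the restart pattern is simply: call Spark.stop(), then Spark.awaitStop(), and only then initialize the new instance.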
If you are stuck on an earlier version (e.g. spark-kotlin, which uses Spark 2.6.0), you can use reflection to inspect Spark's internal state:
fun awaitShutdown() {
    Spark.stop()
    // stop() returns immediately; poll the internal flag until shutdown completes
    while (isSparkInitialized()) {
        Thread.sleep(100)
    }
}
/**
 * Access the internals of Spark to check whether the "initialized" flag
 * has been reset to false.
 */
private fun isSparkInitialized(): Boolean {
    val sparkClass = Spark::class.java
    // Spark.getInstance() is not public, so make it accessible first
    val getInstanceMethod = sparkClass.getDeclaredMethod("getInstance")
    getInstanceMethod.isAccessible = true
    val service = getInstanceMethod.invoke(null) as Service
    val serviceClass = service::class.java
    // The private "initialized" field is set back to false once shutdown finishes
    val initializedField = serviceClass.getDeclaredField("initialized")
    initializedField.isAccessible = true
    return initializedField.getBoolean(service)
}
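The reflection pattern above (call a non-public static factory method, then read a private boolean field off the returned instance) can be exercised without any Spark dependency. The following self-contained example applies the same steps to a hypothetical stand-in Service class, so the names here are illustrative, not Spark's real internals:

```java
import java.lang.reflect.Field;
import java.lang.reflect.Method;

public class ReflectionFlagDemo {
    // Stand-in for Spark's Service: a singleton with a private "initialized" flag.
    static class Service {
        private boolean initialized = true;
        private static final Service INSTANCE = new Service();

        private static Service getInstance() {
            return INSTANCE;
        }
    }

    static boolean readInitialized() throws Exception {
        // Step 1: invoke the non-public static getInstance() method reflectively.
        Method getInstance = Service.class.getDeclaredMethod("getInstance");
        getInstance.setAccessible(true);
        Object service = getInstance.invoke(null);

        // Step 2: read the private boolean field off the instance.
        Field initialized = service.getClass().getDeclaredField("initialized");
        initialized.setAccessible(true);
        return initialized.getBoolean(service);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(readInitialized()); // flag starts out true
        Service.INSTANCE.initialized = false;  // simulate shutdown completing
        System.out.println(readInitialized()); // flag is now false
    }
}
```

Note that this ties your test code to private implementation details, so it can break on any Spark upgrade; prefer the official awaitStop() as soon as you can move to 2.8.0 or later.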