apache-spark, mesos, spark-jobserver

java.lang.UnsatisfiedLinkError: no mesos in java.library.path


I am trying to configure spark-jobserver for Mesos cluster deployment mode. I have set spark.master = "mesos://mesos-master:5050" in the jobserver config.

When I try to create a context on the job-server, it fails with the following exception:

[2017-04-19 14:09:42,346] ERROR .jobserver.JobManagerActor [] [akka://JobServer/user/jobManager-42-881e-b37be6e443dd] - Failed to create context test-context, shutting down actor
java.lang.UnsatisfiedLinkError: no mesos in java.library.path
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
    at java.lang.Runtime.loadLibrary0(Runtime.java:870)
    at java.lang.System.loadLibrary(System.java:1122)
    at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:54)
    at org.apache.mesos.MesosNativeLibrary.load(MesosNativeLibrary.java:79)
    at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2485)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:492)
    at spark.jobserver.context.DefaultSparkContextFactory$$anon$1.<init>(SparkContextFactory.scala:119)
    at spark.jobserver.context.DefaultSparkContextFactory.makeContext(SparkContextFactory.scala:119)
    at spark.jobserver.context.DefaultSparkContextFactory.makeContext(SparkContextFactory.scala:114)
    at spark.jobserver.context.SparkContextFactory$class.makeContext(SparkContextFactory.scala:63)
    at spark.jobserver.context.DefaultSparkContextFactory.makeContext(SparkContextFactory.scala:114)
    at spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:135)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
    at spark.jobserver.common.akka.ActorStack$$anonfun$receive$1.applyOrElse(ActorStack.scala:33)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
    at spark.jobserver.common.akka.Slf4jLogging$$anonfun$receive$1$$anonfun$applyOrElse$1.apply$mcV$sp(Slf4jLogging.scala:25)
    at spark.jobserver.common.akka.Slf4jLogging$class.spark$jobserver$common$akka$Slf4jLogging$$withAkkaSourceLogging(Slf4jLogging.scala:34)
    at spark.jobserver.common.akka.Slf4jLogging$$anonfun$receive$1.applyOrElse(Slf4jLogging.scala:24)
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
    at spark.jobserver.common.akka.ActorMetrics$$anonfun$receive$1.applyOrElse(ActorMetrics.scala:23)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:484)
    at spark.jobserver.common.akka.InstrumentedActor.aroundReceive(InstrumentedActor.scala:8)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:526)
    at akka.actor.ActorCell.invoke(ActorCell.scala:495)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:257)
    at akka.dispatch.Mailbox.run(Mailbox.scala:224)
    at akka.dispatch.Mailbox.exec(Mailbox.scala:234)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

I have set the environment variable MESOS_NATIVE_JAVA_LIBRARY to point to the correct location of libmesos.so. Also, I am able to submit a job successfully using spark-submit from the command line:

 ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master mesos://mesos-master:5050 /pathto/spark/examples.jar 100

That means my Mesos cluster setup is working.
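One thing worth checking in a situation like this is whether the variable is actually visible to the process that fails, not just to the interactive shell. A minimal sketch (the libmesos.so path is an assumption; adjust for your install) showing that a process started with a clean environment, which is roughly what `sudo`'s default `env_reset` does, will not see a variable exported only in the user's shell:

```shell
# Exported in my user's shell, as in the question
# (path is an assumed example location of libmesos.so):
export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so

# Visible in this shell:
env | grep MESOS_NATIVE_JAVA_LIBRARY

# But a child started with a scrubbed environment does not inherit it,
# which is effectively what happens under `sudo` with env_reset:
env -i sh -c 'echo "MESOS var: ${MESOS_NATIVE_JAVA_LIBRARY:-<unset>}"'
```

If the second command prints `<unset>`, the JVM launched that way falls back to searching java.library.path for "mesos", which produces exactly the UnsatisfiedLinkError above.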

Am I missing any configuration that needs to be done specifically for spark-jobserver?


Solution

  • I had set the MESOS_NATIVE_JAVA_LIBRARY environment variable for my user, but I was running job-server with sudo privileges. By default, sudo resets the environment, so the variable was not visible to the job-server process, and the JVM fell back to searching java.library.path.
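Two common ways to fix this, sketched below. The script names are assumptions based on a typical spark-jobserver deployment (`server_start.sh`, `settings.sh`); adapt them to your own layout:

```shell
# Option 1: preserve the calling user's environment when using sudo
# (works only if the sudoers policy permits it):
sudo -E ./server_start.sh

# Option 2: export the variable inside the job-server's own settings
# script, so it is set regardless of who launches the process
# (settings.sh and the libmesos.so path are assumed names/locations):
echo 'export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so' >> settings.sh
```

Option 2 is usually the more robust choice, since it does not depend on how (or by whom) the server is started.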