Tags: java, hadoop, java-native-interface, gpu

Error running Java with JNI on Hadoop


I am trying to run a Java program that uses JNI to call GPU code on Hadoop 2.3.0, but I get the following error:

java.lang.Exception: java.lang.UnsatisfiedLinkError: affy.qualityControl.PLM.wlsAcc([D[D[DII)V
    at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.lang.UnsatisfiedLinkError: affy.qualityControl.PLM.wlsAcc([D[D[DII)V
    at affy.qualityControl.PLM.wlsAcc(Native Method)
    at affy.qualityControl.PLM.rlm_fit_anova(PLM.java:141)
    at affy.qualityControl.PLM.PLMsummarize(PLM.java:31)
    at affy.qualityControl.SummarizePLMReducer.reduce(SummarizePLMReducer.java:59)
    at affy.qualityControl.SummarizePLMReducer.reduce(SummarizePLMReducer.java:12)
    at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:171)
    at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:627)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

I suspect the error is related to JNI. I wrote a small standalone Java test that calls my GPU code (wlsAcc) via JNI, and it works fine. I also ran ldd on my GPU shared library, and all of its dependencies are linked. In addition, I added the following code to my MapReduce job (my GPU code is called in the Reducer):

    setInputParameters(conf, args);
    // Ship the native library via the DistributedCache and symlink it
    // as libjniWrapper.so in each task's working directory.
    DistributedCache.createSymlink(conf);
    DistributedCache.addCacheFile(new URI("/user/sniu/libjniWrapper.so#libjniWrapper.so"), conf);
    // Point the reduce-task JVM's native library search path at the
    // task's working directory, where the symlink appears.
    conf.set("mapred.reduce.child.java.opts", "-Djava.library.path=.");

I also copied libjniWrapper.so to HDFS under /user/sniu/. I still can't figure out why Hadoop cannot find my native shared library. Does anyone know what the problem is?
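
For context, the question does not show how the native library is loaded on the Java side. A typical arrangement, and presumably what this code does, is a static initializer in the class that declares the native method. The sketch below is a reconstruction from the stack trace and the library name; the class body, the System.loadLibrary call, and the parameter names are assumptions (the method is shown static for simplicity):

    package affy.qualityControl;

    public class PLM {
        static {
            // Resolved against java.library.path; with the driver's
            // -Djava.library.path=. setting, this picks up the
            // libjniWrapper.so symlink that the DistributedCache
            // places in each task's working directory.
            System.loadLibrary("jniWrapper");
        }

        // Native method from the stack trace; its descriptor ([D[D[DII)V
        // means three double[] arguments and two ints, returning void.
        public static native void wlsAcc(double[] a, double[] b,
                                         double[] c, int m, int n);
    }

Note that the error occurs at the wlsAcc call itself rather than at System.loadLibrary, which indicates the library was found and loaded but the expected symbol was missing from it.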


Solution

  • The problem is now fixed. In the native C code, I had originally written the function name like this:

    JNIEXPORT void JNICALL Java_jniWrapper_wlsAcc

    Instead, the correct name must include the fully qualified class name, with the dots replaced by underscores:

    JNIEXPORT void JNICALL Java_affy_qualityControl_jniWrapper_wlsAcc
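
For background on why the name matters: the JVM resolves a native method to a C symbol built as Java_, then the declaring class's fully qualified name with dots replaced by underscores, then _ and the method name. A package-less Java_jniWrapper_wlsAcc therefore never matches a method declared inside the affy.qualityControl package, which is exactly what the UnsatisfiedLinkError above reports. Below is a minimal Java-side sketch of the rule, assuming (as the corrected symbol implies) that the native method is declared in a class affy.qualityControl.jniWrapper; whichever class actually declares it determines the expected symbol:

    package affy.qualityControl;

    public class jniWrapper {
        // The JVM binds this declaration to the C symbol
        //   Java_affy_qualityControl_jniWrapper_wlsAcc
        // and never looks for a package-less Java_jniWrapper_wlsAcc.
        // Parameter list reconstructed from the descriptor ([D[D[DII)V.
        public static native void wlsAcc(double[] a, double[] b,
                                         double[] c, int m, int n);
    }

Generating the header with javah (or javac -h on newer JDKs) emits the correctly mangled declaration automatically, which avoids this class of error.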