Tags: c++, compilation, hadoop, dynamic-linking, static-linking

Hadoop Pipes cannot find shared libraries


I am getting the following error while running a Hadoop Pipes program. The program compiles successfully but fails at runtime under Hadoop Pipes.

error while loading shared libraries: Lib.so.0: cannot open shared object file: No such file or directory

Makefile:

CC = g++
HADOOP_PATH = usr/lib/HADOOP
OTHERLIB1_PATH = usr/lib/OTHERLIB1
OTHERLIB2_PATH = usr/lib/OTHERLIB2
OTHERLIB3_PATH = usr/lib/OTHERLIB3
OTHERLIB4_PATH = usr/lib/OTHERLIB4
IMAGE_PATH = usr/lib/IMAGE
LIB_PATH = ../../../src/Lib
PLATFORM = Linux-amd64-64

CFLAGS_HDP =  -O3 \
        -I$(LIB_PATH) \
        -I$(OTHERLIB1_PATH)/include \
        -I$(HADOOP_PATH)/$(PLATFORM)/include \
        -I$(OTHERLIB4_PATH)/include \
        -I$(OTHERLIB2_PATH)/include \
        -I$(OTHERLIB3_PATH)/include 

LDFLAGS_HDP =   -L$(OTHERLIB1_PATH)/lib \
        -L$(HADOOP_PATH)/$(PLATFORM)/lib \
        -L$(OTHERLIB3_PATH)/lib \
        -L$(OTHERLIB2_PATH)/lib \
        -L$(OTHERLIB4_PATH)/lib \
        -L$(LIB_PATH)/.libs \
        -lhadooppipes -lhadooputils -lpthread -lcrypto \
        -lLib -lLib4 -lLib1

all: pipes clean 

clean:
        rm  *.o

pipes: LibPipes.cpp xml DocToXml
        $(CC) $(CFLAGS_HDP) \
        LibPipes.cpp \
        -o Lib_Pipes base64.o \
        xml.o DocToXml.o $(LDFLAGS_HDP)


xml: xml.cpp base64
        $(CC) $(CFLAGS_HDP) base64.o -c xml.cpp -o xml.o

base64: base64.cpp
        $(CC) $(CFLAGS_HDP) -c base64.cpp -o base64.o

DocToXml: DocToXml.cpp
        $(CC) $(CFLAGS_HDP) -c DocToXml.cpp -o  DocToXml.o

I run the program on hadoop using the following command:

hadoop pipes \
-D hadoop.pipes.java.recordreader=false \
-D hadoop.pipes.java.recordwriter=false \
-D mapred.map.tasks=128 \
-inputformat org.apache.hadoop.mapred.SequenceFileInputFormat \
-writer org.apache.hadoop.mapred.SequenceFileOutputFormat \
-reduce org.apache.hadoop.mapred.lib.IdentityReducer \
-input Input \
-output Output \
-program /user/uss/bin/Lib_Pipes \
-reduces 1

This seems to be a problem caused by dynamic linking. I have tried shipping the libraries to Hadoop using the -files flag. I have also tried linking the program statically with various flags such as -static, -Bstatic, -static-libgcc, and -static-libstdc++, but none of these work either. Does anyone know how to handle this kind of binary on Hadoop Pipes? Any help would be appreciated.


Solution

  • Figured out the problem: it was a stray space after the comma in the -files flag's comma-separated file list.
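In concrete terms (the paths below are hypothetical, not from the original post): -files takes a comma-separated list, and whitespace after a comma makes the rest of the list parse as a separate argument, so the second library never reaches the task nodes:

```
# Broken: the space after the comma splits the list
hadoop pipes -files /usr/lib/Lib.so.0, /usr/lib/libOther.so ...

# Fixed: no whitespace anywhere in the list
hadoop pipes -files /usr/lib/Lib.so.0,/usr/lib/libOther.so ...
```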