I get the error below when I try to format my HDFS by running this command: $HADOOP_HOME/bin/hdfs namenode -format
on my Ubuntu 18.04 machine:
/home/mohamedamine/Downloads/hadoopWork/hadoop/bin/hdfs: line 304: /home/mohamedamine/Downloads/hadoopWork/jdk1.8.0_101/jre/bin/java: No such file or directory
I'm using this documentation to install Hadoop.
I googled the same problem, and all the answers I found talk about checking the Java path. I checked every Java path and I'm sure they are all correct. Below is my ~/.bashrc content:
#Set HADOOP_HOME
export HADOOP_HOME=/home/mohamedamine/Downloads/hadoopWork/hadoop
#Set JAVA_HOME
export JAVA_HOME=/home/mohamedamine/Downloads/hadoopWork/jdk1.8.0_101
# Add bin/ directory of Hadoop to PATH
export PATH=$PATH:$HADOOP_HOME/bin
Below is also my Java path in hadoop-env.sh:
# The java implementation to use.
export JAVA_HOME=/home/mohamedamine/Downloads/hadoopWork/jdk1.8.0_101
I also tried this path:
# The java implementation to use.
export JAVA_HOME=/home/mohamedamine/Downloads/hadoopWork/jdk1.8.0_101/jre
But I always get the same error. I'm using Hadoop version 2.7.3.
If you could show me how to resolve this error, it would be greatly appreciated. Thanks a lot.
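As a sanity check, this is a small snippet (using the exact path from the error message above; adjust it to your own layout) that tests whether the binary the hdfs script complains about actually exists and is executable:

```shell
# Check the exact binary named in the hdfs error message
# (this is the path from my setup; adjust to yours)
JAVA_BIN=/home/mohamedamine/Downloads/hadoopWork/jdk1.8.0_101/jre/bin/java
if [ -x "$JAVA_BIN" ]; then
  "$JAVA_BIN" -version
else
  echo "missing or not executable: $JAVA_BIN"
fi
```

If this prints a version string, the path itself is fine and the problem lies elsewhere; if it prints the "missing or not executable" line, the JDK directory does not contain what hadoop-env.sh points at.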
It was just a problem with that particular JDK build; I don't know, maybe Oracle removed some files from the newer version. It worked when I switched from jdk 1.8.0_101 to jdk 1.8.0_05. Here is the link to the old JDK versions in the Oracle archive: Java archive
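For what it's worth, another common cause of this exact symptom (an assumption on my part, not something I verified for this case): bash reports "No such file or directory" for a binary that *does* exist when the binary is a 32-bit ELF executable and the 64-bit Ubuntu host is missing the 32-bit loader. A quick way to check which kind of JDK build you downloaded:

```shell
# On 64-bit Linux, a 32-bit binary whose loader (/lib/ld-linux.so.2) is
# absent fails with the same "No such file or directory" error even
# though the file exists; `file` reveals the architecture.
JAVA_BIN=/home/mohamedamine/Downloads/hadoopWork/jdk1.8.0_101/jre/bin/java
if [ -e "$JAVA_BIN" ]; then
  file "$JAVA_BIN"    # look for "ELF 32-bit" vs "ELF 64-bit"
else
  echo "no such file: $JAVA_BIN"
fi
```

If `file` reports "ELF 32-bit", downloading the x64 tarball of the same JDK release would be another way to fix it without changing versions.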