I got this error when I tried to execute this command: $ bin/hadoop namenode -format
/home/MAHI/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 31: unexpected EOF while looking for matching `"'
/home/MAHI/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 58: syntax error: unexpected end of file

Here is my conf/hadoop-env.sh:
# The java implementation to use. Required.
export JAVA_HOME= "C:\Java\"
# Extra Java CLASSPATH elements. Optional.
# export HADOOP_CLASSPATH=
# The maximum amount of heap to use, in MB. Default is 1000.
# export HADOOP_HEAPSIZE=2000
# Extra Java runtime options. Empty by default.
# export HADOOP_OPTS=-server
# Command specific options appended to HADOOP_OPTS when specified
export HADOOP_NAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_NAMENODE_OPTS"
export HADOOP_SECONDARYNAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_SECONDARYNAMENODE_OPTS"
export HADOOP_DATANODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_DATANODE_OPTS"
export HADOOP_BALANCER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_BALANCER_OPTS"
export HADOOP_JOBTRACKER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_JOBTRACKER_OPTS"
export HADOOP_TASKTRACKER_OPTS=
# The following applies to multiple commands (fs, dfs, fsck, distcp etc)
export HADOOP_CLIENT_OPTS

# Extra ssh options. Empty by default.
# export HADOOP_SSH_OPTS="-o ConnectTimeout=1 -o SendEnv=HADOOP_CONF_DIR"

# Where log files are stored. $HADOOP_HOME/logs by default.
# export HADOOP_LOG_DIR=${HADOOP_HOME}/logs

# File naming remote slave hosts. $HADOOP_HOME/conf/slaves by default.
# export HADOOP_SLAVES=${HADOOP_HOME}/conf/slaves

# host:path where hadoop code should be rsync'd from. Unset by default.
# export HADOOP_MASTER=master:/home/$USER/src/hadoop

# Seconds to sleep between slave commands. Unset by default. This
# can be useful in large clusters, where, e.g., slave rsyncs can
# otherwise arrive faster than the master can service them.
# export HADOOP_SLAVE_SLEEP=0.1

# The directory where pid files are stored. /tmp by default.
# NOTE: this should be set to a directory that can only be written to by
#       the users that are going to run the hadoop daemons. Otherwise there is
#       the potential for a symlink attack.
# export HADOOP_PID_DIR=/var/hadoop/pids

# A string representing this instance of hadoop. $USER by default.
# export HADOOP_IDENT_STRING=$USER

# The scheduling priority for daemon processes. See 'man nice'.
# export HADOOP_NICENESS=10
You have set JAVA_HOME incorrectly in hadoop-env.sh. In export JAVA_HOME= "C:\Java\" the trailing backslash escapes the closing double quote, so bash keeps scanning for the matching " until it runs out of file; that is exactly what the two error messages are telling you. There is also a space after the =, and C:\Java is a Windows path that means nothing on Linux. Give the absolute path of your JDK installation instead. You can find your current java path using the command below:
alternatives --config java
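If your distribution does not have the alternatives command (that is the Red Hat/CentOS name; on Debian/Ubuntu it is called update-alternatives), a standard coreutils one-liner gives the same information:

readlink -f $(which java)

This prints the fully resolved path of the java binary; JAVA_HOME is that path with the trailing /bin/java removed (and without the /jre component, if one appears).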
It will list all the Java versions you have installed; pick the proper one and set its path in hadoop-env.sh like below. Note that JAVA_HOME must point at the JDK root directory, not its bin subdirectory:
export JAVA_HOME=/usr/java/jdk1.*
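Applied to the file in the question, the fix for line 31 would look like this; the JDK path here is only an example, so substitute whatever path the previous step reported:

# broken: space after '=' and a trailing backslash that escapes the closing quote
# export JAVA_HOME= "C:\Java\"
# fixed: absolute Linux JDK root, no space, no trailing backslash (example path)
export JAVA_HOME=/usr/java/jdk1.7.0_45

You can sanity-check the value with $JAVA_HOME/bin/java -version before re-running bin/hadoop namenode -format.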
Another approach is to set $JAVA_HOME in the user's .bashrc, so there is no need to set it in hadoop-env.sh at all.
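A minimal sketch of that approach, again with an example JDK path:

# at the end of ~/.bashrc
export JAVA_HOME=/usr/java/jdk1.7.0_45   # example; use your actual JDK root
export PATH=$JAVA_HOME/bin:$PATH

Then run source ~/.bashrc (or open a new terminal) so the variable is visible to the Hadoop scripts.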