I configured and ran the Mahout examples, but I get the following error:
hadoop binary is not in PATH,HADOOP_HOME/bin,HADOOP_PREFIX/bin, running locally
Error occurred during initialization of VM
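Here is how I check whether the hadoop binary is visible from a fresh shell (just a quick sanity check; the expected values are the paths from my configuration below):

which hadoop            # should print the path to the hadoop launcher if it is on PATH
echo $PATH              # should contain /home/user/hadoop-0.20.2/bin
echo $HADOOP_HOME       # should print /home/user/hadoop-0.20.2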
In my .bashrc, I defined the following environment variables: JAVA_HOME, HADOOP_CONF_DIR, MAHOUT_CONF_DIR, HADOOP_HOME.
I have already configured these in /etc/bash.bashrc:
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk-i386
export PATH=$PATH:$JAVA_HOME/bin
export HADOOP_HOME=/home/user/hadoop-0.20.2
export PATH=$PATH:$HADOOP_HOME/bin
export MAHOUT_HOME=/home/user/mahout/trunk
export classpath=$classpath:$MAHOUT_HOME/src/conf
export HADOOP_CONF_DIR=/home/user/hadoop-0.20.2/conf
export classpath=$classpath:$HADOOP_CONF_DIR
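After editing the file, I reload it and check that the variables are actually set (a quick sanity check, assuming a bash shell):

source /etc/bash.bashrc
echo $JAVA_HOME         # expected: /usr/lib/jvm/java-6-openjdk-i386
echo $HADOOP_HOME       # expected: /home/user/hadoop-0.20.2
echo $HADOOP_CONF_DIR   # expected: /home/user/hadoop-0.20.2/conf
echo $MAHOUT_HOME       # expected: /home/user/mahout/trunk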
However, I now get the following error:
MAHOUT_LOCAL is set, so we don't add HADOOP_CONF_DIR to classpath.
Error: Could not find or load main class classpath
MAHOUT_LOCAL is set, running locally
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.
This has nothing to do with Mahout or Hadoop; it is a problem with your shell configuration. For example, you appear to be using HADOOP_HOME instead of $HADOOP_HOME in a path expression somewhere.
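To illustrate the kind of mistake I mean (a minimal sketch, not a diagnosis of your exact files):

export PATH=$PATH:HADOOP_HOME/bin     # wrong: appends the literal string "HADOOP_HOME/bin" to PATH
export PATH=$PATH:$HADOOP_HOME/bin    # right: appends the expanded directory, e.g. /home/user/hadoop-0.20.2/bin

# Note that bash is case-sensitive, so a lowercase "classpath" variable is
# unrelated to the conventional CLASSPATH; if the intent was to extend the
# Java classpath, the uppercase form would look like this:
export CLASSPATH=$CLASSPATH:$MAHOUT_HOME/src/conf:$HADOOP_CONF_DIR

After fixing the expressions, open a new shell (or source the file again) and check the values with echo before rerunning the Mahout example.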