I'm new to Hadoop, so excuse me if the question is stupid. I have a local single-node cluster. I'm trying to run a simple MapReduce job in RHadoop and I get this message:
> wordcount('/data/complete_works_of_shakespeare.txt')
Error creating temp dir in hadoop.tmp.dir /app/hadoop/tmp due to Permission denied
Error in mr(map = map, reduce = reduce, combine = combine, vectorized.reduce, :
hadoop streaming failed with error code 255
I also get these warnings:
1: In rmr.options("backend") :
Please set an HDFS temp directory with rmr.options(hdfs.tempdir = ...)
2: In rmr.options("hdfs.tempdir") :
Please set an HDFS temp directory with rmr.options(hdfs.tempdir = ...)
3: In rmr.options("backend") :
Please set an HDFS temp directory with rmr.options(hdfs.tempdir = ...)
4: In rmr.options("backend.parameters") :
Please set an HDFS temp directory with rmr.options(hdfs.tempdir = ...)
What should I do to set all these options? What should the path for hdfs.tempdir look like? I'd like to set the temp directory to the following (if I understand correctly what R wants from me):
hduser@nina:~$ hadoop fs -ls /
14/08/10 12:52:47 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 6 items
...
drwxr-xr-x - hduser supergroup 0 2014-08-10 00:04 /temp
By the way, I've already tried:
rmr.options(backend="hadoop")
NULL
Warning:
In rmr.options(backend = "hadoop") :
Please set an HDFS temp directory with rmr.options(hdfs.tempdir = ...)
Please help. Many thanks.
UPD:
Also tried:
hduser@nina:~$ sudo chmod -R 777 '/app/hadoop'
hduser@nina:~$ sudo chmod -R 777 '/tmp'
That doesn't help either.
Verify that you have permission to create a folder in the /app/hadoop directory. Try chown on the directory, for example:
sudo chown <userid> <directory_path>
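Once the local permissions are sorted out, the missing rmr2 option can be set from R. A minimal sketch, assuming the HDFS directory /temp shown in your listing is writable by the user running the job (adjust the path to whatever writable HDFS directory you have):

```r
library(rmr2)

# Point rmr2 at a writable HDFS temp directory.
# "/temp" is an assumption based on your `hadoop fs -ls /` output;
# substitute any HDFS path your user can write to.
rmr.options(backend = "hadoop", hdfs.tempdir = "/temp")
```

With this set, re-running the job should no longer emit the hdfs.tempdir warnings. Note that the "Permission denied" on /app/hadoop/tmp is a separate, local-filesystem issue, which the chown above addresses.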