I want to copy a local file into the Hadoop FS. I ran this command:
sara@ubuntu:/usr/lib/hadoop/hadoop-2.3.0/bin$ hadoop fs -copyFromLocal /home/sara/Downloads/CA-GrQc.txt /usr/lib/hadoop/hadoop-2.3.0/${HADOOP_HOME}/hdfs/namenode
and
sara@ubuntu:/usr/lib/hadoop/hadoop-2.3.0/bin$ hdfs dfs -copyFromLocal /home/sara/Downloads/CA-GrQc.txt /usr/lib/hadoop/hadoop-2.3.0/${HADOOP_HOME}/hdfs/namenode
and even when I run: hdfs dfs -ls
I get this error:
> WARN util.NativeCodeLoader: Unable to load native-hadoop library for
> your platform... using builtin-java classes where applicable
> copyFromLocal: `.': No such file or directory
I don't know why I get this error. Any ideas?
Judging from your output, your Hadoop installation seems to be working fine. The problem is that hadoop fs -copyFromLocal
expects an HDFS directory as the target directory, not the local directory where Hadoop stores its blocks.
So in your case the command should look like (for example):
sara@ubuntu:/usr/lib/hadoop/hadoop-2.3.0/bin$ hdfs dfs -copyFromLocal /home/sara/Downloads/CA-GrQc.txt /sampleDir/
where sampleDir
is a directory you create with the hadoop fs -mkdir
command.
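As a minimal end-to-end sketch (the directory name /sampleDir is just an example, and this assumes HDFS is up and running):

```shell
# Create the target directory in HDFS (not on the local filesystem).
hdfs dfs -mkdir /sampleDir

# Copy the local file into that HDFS directory.
hdfs dfs -copyFromLocal /home/sara/Downloads/CA-GrQc.txt /sampleDir/

# Verify the upload by listing the directory's contents in HDFS.
hdfs dfs -ls /sampleDir
```

As a side note, a bare hdfs dfs -ls lists your HDFS home directory (typically /user/&lt;username&gt;); the copyFromLocal: `.': No such file or directory message suggests that directory does not exist yet, so creating it (e.g. hdfs dfs -mkdir -p /user/sara) should also make hdfs dfs -ls work without an explicit path.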