I have several files in a Unix directory that I have to move to Hadoop. I know the copyFromLocal command:
Usage: hadoop fs -copyFromLocal <localsrc> URI
but that only lets me copy them one at a time.
Is there any way to move all of those files to HDFS in one command? I want to know if there is a way to transfer several files at once.
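For reference, the one-at-a-time invocation I am using looks roughly like this (the paths are placeholders, not my real ones):

hadoop fs -copyFromLocal /local/dir/file1 /user/hadoop/target/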
The put command will work.

If you want to copy a whole directory from local to HDFS:
hadoop fs -put /path1/file1 /pathx/target/
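(Here /path1/file1 is assumed to be a directory despite the name; -put copies it recursively, so it ends up as /pathx/target/file1 on HDFS.)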
If you want to copy all files from a directory to HDFS in one go:
hadoop fs -put /path1/file1/* /pathx/target/
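If you need to pick out specific files rather than a whole directory, -put also accepts several local sources before the destination, and -copyFromLocal behaves the same way (it is essentially put restricted to local sources), so something like this should work too (paths are placeholders):

hadoop fs -put /path1/file1 /path1/file2 /path1/file3 /pathx/target/
hadoop fs -copyFromLocal /path1/* /pathx/target/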