I am running a Hadoop MapReduce job from a Python program that builds the various input paths passed as parameters to the job. Before passing these input paths to the MapReduce job, I check whether each one exists in HDFS with the command:
hadoop fs -test -e 'filename'
My Python program reads the exit code of that command to determine whether the file exists (-test returns 0 when the file exists and a nonzero exit code otherwise). Since the Python program is already checking for path existence and writing all of the nonexistent paths to a separate .txt document, I do not need the command-line warnings telling me which paths do not exist.
I would like to know how to suppress (or ignore) the automatic hadoop fs output:
test: 'fileName': No such file or directory
as I am inputting a huge number of paths and quite a few of them do not exist in hadoop fs.
Redirect the error/warning output (stderr) to /dev/null:
hdfs dfs -test -e I/dont/exist 2>/dev/null