Tags: amazon-web-services, hadoop-streaming, elastic-map-reduce

Adding extra arguments to HadoopJarStepConfig fails


I am trying to produce this command via the AWS SDK:

hadoop jar /home/hadoop/contrib/streaming/hadoop-streaming.jar -input hdfs:///logs/ -output hdfs:///no_dups -mapper dedup_mapper.py -reducer dedup_reducer.py -file deduplication.py dedup_mapper.py dedup_reducer.py timber.py signature_v4.py

My Java code is:

HadoopJarStepConfig config = new StreamingStep()
        .withInputs("hdfs:///logs")
        .withOutput("hdfs:///no_dups")
        .withMapper("dedup_mapper.py")
        .withReducer("dedup_reducer.py")
        .toHadoopJarStepConfig();

Collection<String> aggs = config.getArgs();
aggs.add("-file deduplication.py timber.py dedup_mapper.py dedup_reducer.py signature_v4.py");
config.setArgs(aggs);

Which produces the following AddJobFlowStepsRequest (when toString() is called):

{JobFlowId: j-3TDECOMCOO8HE, Steps: [{Name: DeDup, ActionOnFailure: CONTINUE, HadoopJarStep: {Properties: [], Jar: /home/hadoop/contrib/streaming/hadoop-streaming.jar, Args: [-input, hdfs:///logs, -output, hdfs:///no_dups, -mapper, dedup_mapper.py, -reducer, dedup_reducer.py, -file deduplication.py timber.py dedup_mapper.py dedup_reducer.py signature_v4.py], }, }], }

And finally, the error I am seeing on the Master Node:

2013-04-26 16:43:48,116 ERROR org.apache.hadoop.streaming.StreamJob (main): Unrecognized option: -file deduplication.py timber.py dedup_mapper.py dedup_reducer.py signature_v4.p

The strange thing is that the error log lists the available options, and -file is among them. Has anyone else seen this issue?

More logs:

2013-04-26T16:43:46.105Z INFO Fetching jar file.

2013-04-26T16:43:47.609Z INFO Working dir /mnt/var/lib/hadoop/steps/9

2013-04-26T16:43:47.609Z INFO Executing /usr/lib/jvm/java-6-sun/bin/java -cp /home/hadoop/conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/home/hadoop:/home/hadoop/hadoop-core-1.0.3.jar:/home/hadoop/hadoop-tools.jar:/home/hadoop/hadoop-tools-1.0.3.jar:/home/hadoop/hadoop-core.jar:/home/hadoop/lib/*:/home/hadoop/lib/jetty-ext/* -Xmx1000m -Dhadoop.log.dir=/mnt/var/log/hadoop/steps/9 -Dhadoop.log.file=syslog -Dhadoop.home.dir=/home/hadoop -Dhadoop.id.str=hadoop -Dhadoop.root.logger=INFO,DRFA -Djava.io.tmpdir=/mnt/var/lib/hadoop/steps/9/tmp -Djava.library.path=/home/hadoop/native/Linux-amd64-64 org.apache.hadoop.util.RunJar /home/hadoop/contrib/streaming/hadoop-streaming.jar -input hdfs:///logs -output hdfs:///no_dups -mapper dedup_mapper.py -reducer dedup_reducer.py -file deduplication.py timber.py dedup_mapper.py dedup_reducer.py signature_v4.py

2013-04-26T16:43:48.611Z INFO Execution ended with ret val 1

2013-04-26T16:43:48.612Z WARN Step failed with bad retval

Solution

  • The error appears because the entire string is passed as a single argument, so Hadoop Streaming treats "-file deduplication.py timber.py dedup_mapper.py dedup_reducer.py signature_v4.py" as one unrecognized option name.

    The solution is to add the option flag and its value as separate list entries, like so:

    args.add("-file");
    args.add("myfile.txt");
    

    To attach multiple files, repeat the -file flag for each one:

    args.add("-file");
    args.add("myfile.txt");
    args.add("-file");
    args.add("myfile2.txt");
    

    If you pass all of the file names in one argument, the entire string is treated as a single option token and Hadoop Streaming rejects it with the "Unrecognized option" error shown above. A corrected version of the question's code is sketched below.
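
    Applied to the question's code, the fix looks like this (a minimal sketch, assuming the same AWS SDK for Java classes used in the question; the loop and its file list simply restate the files from the original command):

    import java.util.Collection;

    import com.amazonaws.services.elasticmapreduce.model.HadoopJarStepConfig;
    import com.amazonaws.services.elasticmapreduce.util.StreamingStep;

    HadoopJarStepConfig config = new StreamingStep()
            .withInputs("hdfs:///logs")
            .withOutput("hdfs:///no_dups")
            .withMapper("dedup_mapper.py")
            .withReducer("dedup_reducer.py")
            .toHadoopJarStepConfig();

    // Each flag and each value must be its own list element:
    // a separate "-file" entry followed by a single file name.
    Collection<String> args = config.getArgs();
    for (String file : new String[] { "deduplication.py", "timber.py",
            "dedup_mapper.py", "dedup_reducer.py", "signature_v4.py" }) {
        args.add("-file");
        args.add(file);
    }
    config.setArgs(args);

    This produces Args entries of the form [..., -file, deduplication.py, -file, timber.py, ...], which the streaming jar parses as five separate -file options instead of one malformed one.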