I have written several RHadoop programs that work even though they return warnings such as:
Converting to.dfs argument to keyval with a NULL key
when inputting data with to.dfs.
However, some programs fail fatally with no warnings other than
Converting to.dfs argument to keyval with a NULL key
followed by
ERROR streaming.StreamJob: Job not successful. Error: # of failed Map Tasks exceeded allowed limit. FailedCount: 1.
Is the NULL key warning normally associated with failed Map tasks?
I know the standard advice is to look at stderr, but the stderr for the failed job is empty! Zero lines, zero characters.
As far as I know, the
Converting to.dfs argument to keyval with a NULL key
message is a normal warning: to.dfs just wraps your data in a keyval pair with a NULL key, and that on its own doesn't make a job fail.
Did you try fetching the data you stored with to.dfs back with from.dfs, to check that it round-trips correctly? If it does, the problem probably lies elsewhere.
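A minimal round-trip check might look like the sketch below. It assumes rmr2 is installed and Hadoop is configured; "mydata" is a stand-in for whatever object you actually passed to to.dfs.

```r
library(rmr2)

mydata <- mtcars            # placeholder input; any keyless data triggers
                            # "Converting to.dfs argument to keyval with a NULL key"
stored <- to.dfs(mydata)    # write to HDFS; returns a big.data.object handle

roundtrip <- from.dfs(stored)   # read it back as a keyval pair
str(keys(roundtrip))            # expect NULL, matching the warning
str(values(roundtrip))          # should contain the same data as mydata
```

If values(roundtrip) matches what you put in, to.dfs and the NULL-key warning are not the problem, and the failed map tasks are more likely caused by the map function itself or the streaming environment.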