hadoop · hdfs · bigdata · flume

Transferring files from remote node to HDFS with Flume


I have a bunch of binary files compressed into *gz format. These are generated on a remote node and must be transferred to HDFS, which lives on one of the datacenter's servers.

I'm exploring the option of sending the files with Flume, in particular with a Spooling Directory configuration, but apparently this only works when the spooled directory is located locally on the HDFS node itself.

Any suggestions on how to tackle this problem?


Solution

  • There is no out-of-the-box solution for this case, but you could try these workarounds:

    1. You could create your own source implementation for this purpose (using the Flume SDK). For example, this project appears to connect to a remote directory over SSH and use it as a source. A minimal skeleton of such a custom source is sketched after this list.
    2. You could create a scheduled script that periodically copies the remote files into a local spool directory, then use that directory as a Spooling Directory source for the Flume agent (first configuration sketch below).
    3. You could write another script that reads your remote data and writes it to its standard output, and use that script with an Exec source (second sketch below).
    4. You could run Flume (and its agent) on the machine where the data is located (see Can Spool Dir of flume be in remote machine?); the last sketch below shows a common two-tier setup where that agent forwards events over Avro to an agent next to HDFS.
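
For option 1, here is a minimal sketch of a custom pollable source, assuming Flume 1.x with flume-ng-core on the classpath. The class name RemoteDirSource, the remoteHost/remoteDir properties, and the fetchNextRemoteFile() helper are hypothetical placeholders; the actual SSH/SFTP logic (for example via a library such as JSch) is left out.

    import org.apache.flume.Context;
    import org.apache.flume.Event;
    import org.apache.flume.EventDeliveryException;
    import org.apache.flume.PollableSource;
    import org.apache.flume.conf.Configurable;
    import org.apache.flume.event.EventBuilder;
    import org.apache.flume.source.AbstractSource;

    // Hypothetical custom source that pulls files from a remote directory.
    public class RemoteDirSource extends AbstractSource implements Configurable, PollableSource {

        private String remoteHost;
        private String remoteDir;

        @Override
        public void configure(Context context) {
            // Read connection settings from the agent configuration file.
            remoteHost = context.getString("remoteHost");
            remoteDir  = context.getString("remoteDir", "/data/outgoing");
        }

        @Override
        public Status process() throws EventDeliveryException {
            try {
                byte[] payload = fetchNextRemoteFile();   // placeholder, see below
                if (payload == null) {
                    return Status.BACKOFF;                // nothing new on the remote side
                }
                Event event = EventBuilder.withBody(payload);
                getChannelProcessor().processEvent(event);
                return Status.READY;
            } catch (Exception e) {
                throw new EventDeliveryException("Failed to pull file from " + remoteHost, e);
            }
        }

        // Placeholder: fetch the next unprocessed *.gz file from remoteHost:remoteDir
        // over SSH/SFTP and return its bytes, or null if there is nothing new.
        private byte[] fetchNextRemoteFile() {
            return null;
        }

        // Required by the PollableSource interface in newer Flume releases.
        public long getBackOffSleepIncrement()   { return 1000L; }
        public long getMaxBackOffSleepInterval() { return 5000L; }
    }

The compiled class would then be referenced in the agent configuration via its fully qualified name as the source type.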
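
For option 2, a sketch of the Flume side only, assuming an external cron job with rsync/scp keeps dropping the *.gz files into /var/flume/spool on the same machine; the agent name, directories, and namenode address are made up. Note that the Spooling Directory source's default line-based deserializer is not suited to binary payloads, so a blob/whole-file deserializer would be needed in practice.

    # Hypothetical agent "agent1": files are staged into the local spool dir
    # by an external scheduled copy job (cron + rsync/scp), not by Flume itself.
    agent1.sources  = spool-src
    agent1.channels = file-ch
    agent1.sinks    = hdfs-sink

    agent1.sources.spool-src.type     = spooldir
    agent1.sources.spool-src.spoolDir = /var/flume/spool
    agent1.sources.spool-src.channels = file-ch

    agent1.channels.file-ch.type = file

    agent1.sinks.hdfs-sink.type          = hdfs
    agent1.sinks.hdfs-sink.channel       = file-ch
    agent1.sinks.hdfs-sink.hdfs.path     = hdfs://namenode:8020/data/incoming
    agent1.sinks.hdfs-sink.hdfs.fileType = DataStream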
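
For option 3, a sketch of an Exec source driven by a hypothetical script /usr/local/bin/pull_remote_data.sh that streams the remote data to its standard output (for example over ssh). Keep in mind that the Exec source turns each output line into an event, so it fits text-like data better than raw binary files, and it gives no delivery guarantees if the script dies.

    # Hypothetical agent "agent1" reading from a script's standard output.
    agent1.sources  = exec-src
    agent1.channels = mem-ch
    agent1.sinks    = hdfs-sink

    agent1.sources.exec-src.type     = exec
    agent1.sources.exec-src.command  = /usr/local/bin/pull_remote_data.sh
    agent1.sources.exec-src.channels = mem-ch

    agent1.channels.mem-ch.type = memory

    agent1.sinks.hdfs-sink.type      = hdfs
    agent1.sinks.hdfs-sink.channel   = mem-ch
    agent1.sinks.hdfs-sink.hdfs.path = hdfs://namenode:8020/data/incoming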
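
For option 4, one common pattern is a two-tier setup: an agent on the remote node spools the local directory and forwards events over Avro to a second agent in the datacenter, which writes to HDFS. Agent names, host names, ports, and paths below are placeholders, and the same binary-deserializer caveat as above applies.

    # --- agent running on the remote node (hypothetical name "remote-agent") ---
    remote-agent.sources  = spool-src
    remote-agent.channels = file-ch
    remote-agent.sinks    = avro-sink

    remote-agent.sources.spool-src.type     = spooldir
    remote-agent.sources.spool-src.spoolDir = /data/outgoing
    remote-agent.sources.spool-src.channels = file-ch

    remote-agent.channels.file-ch.type = file

    remote-agent.sinks.avro-sink.type     = avro
    remote-agent.sinks.avro-sink.channel  = file-ch
    remote-agent.sinks.avro-sink.hostname = collector.dc.example.com
    remote-agent.sinks.avro-sink.port     = 4545

    # --- agent running in the datacenter, next to HDFS ("dc-agent") ---
    dc-agent.sources  = avro-src
    dc-agent.channels = file-ch
    dc-agent.sinks    = hdfs-sink

    dc-agent.sources.avro-src.type     = avro
    dc-agent.sources.avro-src.bind     = 0.0.0.0
    dc-agent.sources.avro-src.port     = 4545
    dc-agent.sources.avro-src.channels = file-ch

    dc-agent.channels.file-ch.type = file

    dc-agent.sinks.hdfs-sink.type          = hdfs
    dc-agent.sinks.hdfs-sink.channel       = file-ch
    dc-agent.sinks.hdfs-sink.hdfs.path     = hdfs://namenode:8020/data/incoming
    dc-agent.sinks.hdfs-sink.hdfs.fileType = DataStream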