
Apache Sqoop : sqoop-import giving Undefined error.


I am working with Apache Hadoop and Apache Sqoop, and I'm trying to import MySQL tables into HDFS.

Here is the command I am executing:

sqoop-import --connect jdbc:mysql://localhost/billing_engine -username root -password root --table cpDetail;

I have set up the Sqoop home environment variables as follows:

export SQOOP_HOME=/Users/bng/Documents/sqoop-1.4.6.bin__hadoop-2.0.4-alpha
export PATH=$PATH:$SQOOP_HOME/bin

But executing the above command gives me the following error:

readlink: illegal option -- f
usage: readlink [-n] [file ...]
usage: dirname path
/Users/bng/Documents/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/bin/sqoop-import: line 26: /Users/bng/sqoop: Undefined error: 0
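The `readlink: illegal option -- f` line is the key clue: the `sqoop-import` wrapper script resolves its own location with GNU-style `readlink -f`, which the BSD `readlink` shipped with macOS does not support, so the script ends up with a broken path and fails on line 26. A quick way to check which flavor of `readlink` you have (a sketch; the `greadlink` alternative mentioned in the output comes from Homebrew's coreutils package and may not be installed):

```shell
# Check whether the local readlink supports GNU's -f (canonicalize) option.
# BSD/macOS readlink lacks -f, which is what breaks Sqoop's sqoop-* wrapper scripts.
if readlink -f / >/dev/null 2>&1; then
  echo "GNU-style readlink: -f supported"
else
  echo "BSD readlink: -f unsupported; consider GNU coreutils (greadlink) or the plain 'sqoop import' form"
fi
```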

Here is a screenshot showing my name node: [screenshot of the NameNode web UI]

Please suggest where I am going wrong.


Solution

  • This is the right command I needed to use:

    sqoop import --connect jdbc:mysql://localhost/billing_engine?useSSL=false --username bil --password bil --table cpdetail -m 1 --target-dir /sqoopFromMysql
    

    The details of the command are as follows:

    1. sqoop import : tells Sqoop to run its import tool.
    2. --connect : the JDBC connection string to use.
    3. jdbc:mysql://localhost/billing_engine?useSSL=false : connects to MySQL over JDBC. The database host is localhost and the database name is billing_engine; useSSL=false specifies that the connection is not made over SSL.
    4. --username bil --password bil : the username and password for the database.
    5. --table cpdetail : the table to import.
    6. -m 1 : the number of map tasks to use (here, a single mapper).
    7. --target-dir /sqoopFromMysql : the target directory in HDFS where the data will be imported.
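    Once the import finishes, the result can be checked from the HDFS side. A minimal sketch, assuming a running cluster and the `/sqoopFromMysql` target directory from the command above; the guard keeps it safe to run on a machine without the `hdfs` CLI:

    ```shell
    # Verify the Sqoop import output in HDFS (skips cleanly if the hdfs CLI is absent).
    if command -v hdfs >/dev/null 2>&1; then
      # With -m 1, expect a single output file: part-m-00000
      hdfs dfs -ls /sqoopFromMysql
      # Rows are written as comma-separated lines by default; peek at the first few
      hdfs dfs -cat /sqoopFromMysql/part-m-00000 | head -n 5
    else
      echo "hdfs CLI not found; run this on a cluster node"
    fi
    ```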