
Not able to Export to Sybase from Hive table


I am trying to export all records from a Hive table to a Sybase table using sqoop export. I don't want to export to Sybase from an HDFS directory. The Sybase driver jar has been placed inside the Sqoop directory.

Sqoop Command:

sqoop export \
 --connect jdbc:sybase:Tds:dbipaddress:0000/DATABASE=omega \
 --username dummy \
 --password dummy \
 --driver com.sybase.jdbc4.jdbc.SybDriver \
 --table omega_events_sybase \
 --hcatalog-table demo.omega_events_hive

When I run this command, I get the following error:

17/04/20 16:44:05 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.5.1
17/04/20 16:44:05 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/04/20 16:44:05 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
17/04/20 16:44:05 INFO manager.SqlManager: Using default fetchSize of 1000
17/04/20 16:44:05 INFO tool.CodeGenTool: Beginning code generation
17/04/20 16:44:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM omega_events AS t WHERE 1=0
17/04/20 16:44:06 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM omega_events AS t WHERE 1=0
Note: Recompile with -Xlint:deprecation for details.
17/04/20 16:44:08 INFO mapreduce.ExportJobBase: Beginning export of omega_events
17/04/20 16:44:09 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/04/20 16:44:09 INFO mapreduce.ExportJobBase: Configuring HCatalog for export job
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hive/hcatalog/mapreduce/HCatOutputFormat
    at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:420)
    at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:912)
    at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:81)
    at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.lang.ClassNotFoundException: org.apache.hive.hcatalog.mapreduce.HCatOutputFormat
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)

Could someone help me resolve this issue?


Solution

  • You are missing the HCatalog libraries. If Hive is already installed on the node from which the Sqoop command is executed, add these environment variables:

    export HCAT_HOME=$HIVE_HOME/hcatalog
    export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HCAT_HOME/share/hcatalog/*
    

    Otherwise, download hive-hcatalog-core-<version>.jar (it is published to the Maven repository) and add it to $SQOOP_HOME/lib/. A combined sketch of both options is shown below.
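
    As a minimal sketch, assuming Hive is installed and $HIVE_HOME is set on the node running Sqoop (the jar locations and the use of -P instead of --password are my additions; adjust them to your distribution):

    # Point HCAT_HOME at the HCatalog directory bundled with Hive
    export HCAT_HOME=$HIVE_HOME/hcatalog
    # Put the HCatalog jars on the classpath seen by the Sqoop job
    export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HCAT_HOME/share/hcatalog/*

    # Alternative: copy the HCatalog core jar into Sqoop's lib directory instead
    # cp $HCAT_HOME/share/hcatalog/hive-hcatalog-core-*.jar $SQOOP_HOME/lib/

    # Sanity check: the class from the stack trace should now be resolvable
    jar tf $HCAT_HOME/share/hcatalog/hive-hcatalog-core-*.jar | grep HCatOutputFormat

    # Then re-run the same export in the same shell session
    sqoop export \
      --connect jdbc:sybase:Tds:dbipaddress:0000/DATABASE=omega \
      --username dummy -P \
      --driver com.sybase.jdbc4.jdbc.SybDriver \
      --table omega_events_sybase \
      --hcatalog-table demo.omega_events_hive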