Sqoop does not import VARCHAR2 datatype to Hadoop

I have a table in an Oracle database and I want to import its data to HDFS. I am trying to do this with Sqoop, but the VARCHAR2 columns are not imported; that data never arrives in the HDFS file. My Sqoop command:
sqoop import -D mapred.job.name='default oraoop' --driver oracle.jdbc.driver.OracleDriver --connect "jdbc:oracle:thin:MyIp:MyServiceName" --username "XXXX" --password "XX" --target-dir "My_dir" --query 'select * from MyTable where $CONDITIONS' --split-by "SplitColumn" --boundary-query "SELECT min(SplitColumn),max(SplitColumn) FROM DUAL" --num-mappers 30
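For reference, here is a sketch of the same command with the thin-driver connect string written in the usual @//host:port/service form and the placeholder names spelled out consistently; the host, port 1521, table and column names below are assumptions for illustration, not values taken from the question:

# assumed placeholders: MyIp, 1521, MyServiceName, MyTable, SplitColumn
sqoop import \
  -D mapred.job.name='default oraoop' \
  --driver oracle.jdbc.driver.OracleDriver \
  --connect "jdbc:oracle:thin:@//MyIp:1521/MyServiceName" \
  --username "XXXX" --password "XX" \
  --target-dir "My_dir" \
  --query 'select * from MyTable where $CONDITIONS' \
  --split-by "SplitColumn" \
  --boundary-query "SELECT MIN(SplitColumn), MAX(SplitColumn) FROM MyTable" \
  --num-mappers 30

Note that the boundary query in this sketch reads the min/max of the split column from the source table rather than from DUAL, which matches what Sqoop's default boundary query would do.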
You can try to downgrade the ojdbc driver: instead of using a higher version such as ojdbc6 or ojdbc7, use ojdbc14. This solved the problem for me. However, to avoid an exception about some encoding classes not being found, remove or rename "orai18n.jar" while importing data from Oracle 9i.
You can find the paths to these jar files in "$HADOOP_CLASSPATH" and under "$SQOOP_HOME"; a sketch of the jar swap is below.
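A minimal sketch of that jar swap, assuming Sqoop picks its JDBC driver up from $SQOOP_HOME/lib (the directory and exact jar file names may differ on your installation, and /path/to/ojdbc14.jar is a placeholder):

# back up the newer driver so only ojdbc14 remains on the classpath (lib dir is an assumption)
mv "$SQOOP_HOME/lib/ojdbc6.jar" "$SQOOP_HOME/lib/ojdbc6.jar.bak"
# drop in the older ojdbc14 driver; the source path is a placeholder
cp /path/to/ojdbc14.jar "$SQOOP_HOME/lib/"
# rename orai18n.jar so its encoding classes are not loaded during the import
mv "$SQOOP_HOME/lib/orai18n.jar" "$SQOOP_HOME/lib/orai18n.jar.bak"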