First, I want to clarify that the CREATE query works fine on its own.
The problem: running query 1 followed by query 3 fails with an error while executing query 3, but running query 2 followed by query 3 works fine. I can't find anything about this on the net.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

class Test {
    public static void main(String[] args) throws SQLException {
        try {
            Class.forName("org.apache.phoenix.jdbc.PhoenixDriver");
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        }
        Connection connection = DriverManager.getConnection("jdbc:phoenix:localhost:2181/hbase-insecure");
        // (note: I have tried without /hbase-insecure, the result is the same)
        // query 1:
        connection.createStatement().executeUpdate("UPSERT INTO tableName VALUES('1','randomValue','randomValue',1234567890, 'randomValue', 'randomValue')");
        // query 2:
        connection.createStatement().executeUpdate("CREATE TABLE IF NOT EXISTS tableName (A VARCHAR(40), Z.B.type VARCHAR, Z.C VARCHAR, Z.D UNSIGNED_LONG, Z.E VARCHAR, X.F VARCHAR CONSTRAINT rowkey PRIMARY KEY (A))");
        // query 3:
        connection.commit();
    }
}
Error:
Exception in thread "streaming-job-executor-0" java.lang.NoSuchMethodError: org.apache.hadoop.hbase.KeyValueUtil.length(Lorg/apache/hadoop/hbase/Cell;)I
    at org.apache.phoenix.util.PhoenixKeyValueUtil.calculateMutationDiskSize(PhoenixKeyValueUtil.java:182)
    at org.apache.phoenix.execute.MutationState.calculateMutationSize(MutationState.java:800)
    at org.apache.phoenix.execute.MutationState.send(MutationState.java:971)
    at org.apache.phoenix.execute.MutationState.send(MutationState.java:1344)
    at org.apache.phoenix.execute.MutationState.commit(MutationState.java:1167)
    at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:670)
    at org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:666)
    at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
    at org.apache.phoenix.jdbc.PhoenixConnection.commit(PhoenixConnection.java:666)
    at com.kratinmobile.uep.services.SparkStream.lambda$null$0(SparkStream.java:119)
    at java.lang.Iterable.forEach(Iterable.java:75)
    at com.kratinmobile.uep.services.SparkStream.lambda$startStreaming$10899135$1(SparkStream.java:102)
    at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$1.apply(JavaDStreamLike.scala:272)
    at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$1.apply(JavaDStreamLike.scala:272)
    at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
    at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Looking at the stack trace, this most likely points to a classpath or version mismatch: the Phoenix client was built against an HBase version whose KeyValueUtil exposes length(Cell), but a different HBase version is on the runtime classpath, so the method cannot be resolved when the mutation is committed.
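One quick way to confirm this is to print which jars the two conflicting classes are actually loaded from at runtime, then compare those jar versions against the HBase release your Phoenix version targets. This is only a diagnostic sketch (the class name ClasspathCheck is made up for illustration), not part of the original code:

import org.apache.hadoop.hbase.KeyValueUtil;
import org.apache.phoenix.util.PhoenixKeyValueUtil;

import java.security.CodeSource;

// Diagnostic sketch: print the jar each class was actually loaded from,
// so mismatched HBase/Phoenix artifacts on the classpath become visible.
public class ClasspathCheck {
    public static void main(String[] args) {
        for (Class<?> clazz : new Class<?>[] { KeyValueUtil.class, PhoenixKeyValueUtil.class }) {
            CodeSource source = clazz.getProtectionDomain().getCodeSource();
            System.out.println(clazz.getName() + " loaded from "
                    + (source != null ? source.getLocation() : "<bootstrap/unknown>"));
        }
    }
}

If the HBase jar reported here is not the version your Phoenix client was built for (for example, an older HBase pulled in by another dependency on the Spark job's classpath), aligning the HBase and Phoenix versions or excluding the duplicate HBase artifact should make the NoSuchMethodError go away.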