I have a CassandraRow object that contains the values of a row. I read it from one table and I want to write that same object to another table. But then I get this error:
requirement failed: Columns not found in class com.datastax.spark.connector.japi.CassandraRow: [myColumn1, myColumns2, ...]
I tried to pass my own mapping by creating a Map and passing it to the function. This is my code:
CassandraRow row = fetch();
Map<String, String> mapping = Map.of("myColumn1", "myColumn1", "myColumns2", "myColumns2", ...);
JavaSparkContext ctx = new JavaSparkContext(conf);
JavaRDD<CassandraRow> insightRDD = ctx.parallelize(List.of(row));
CassandraJavaUtil.javaFunctions(insightRDD)
        .writerBuilder("mykeyspace", "mytable", CassandraJavaUtil.mapToRow(CassandraRow.class, mapping))
        .saveToCassandra(); // I also tried without the mapping
Any help is appreciated. I have tried the POJO approach and it works (sketched below for reference), but I don't want to be restricted to creating POJOs. I want a generic approach that would work with any table and any row.
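For reference, the POJO approach I mean looks roughly like this. It is only a minimal sketch: MyTableRow, its fields, and the pojo variable are placeholders mirroring my table's columns, and ctx is the JavaSparkContext from the code above.

import com.datastax.spark.connector.japi.CassandraJavaUtil;
import org.apache.spark.api.java.JavaRDD;
import java.io.Serializable;
import java.util.List;

// Placeholder bean whose fields mirror the target table's columns
public class MyTableRow implements Serializable {
    private String myColumn1;
    private String myColumns2;

    public String getMyColumn1() { return myColumn1; }
    public void setMyColumn1(String v) { this.myColumn1 = v; }
    public String getMyColumns2() { return myColumns2; }
    public void setMyColumns2(String v) { this.myColumns2 = v; }
}

// Elsewhere: with a bean class, the default row writer resolves properties to columns by name
MyTableRow pojo = new MyTableRow(); // populated from the source row
JavaRDD<MyTableRow> pojoRDD = ctx.parallelize(List.of(pojo));
CassandraJavaUtil.javaFunctions(pojoRDD)
        .writerBuilder("mykeyspace", "mytable", CassandraJavaUtil.mapToRow(MyTableRow.class))
        .saveToCassandra();

This works, but it ties the code to one specific table, which is exactly what I want to avoid.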
I could not find a way to generalize my solution using Apache Spark, so I used the DataStax Java Driver for Apache Cassandra and wrote CQL queries directly. That was generic enough for me.
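The copy logic ended up along the lines of the sketch below (driver 4.x API). The copyRow helper name, the table names, and the assumption that the target table has the same column names and compatible types as the source row are all illustrative, not fixed parts of a solution.

import com.datastax.oss.driver.api.core.CqlSession;
import com.datastax.oss.driver.api.core.cql.ColumnDefinition;
import com.datastax.oss.driver.api.core.cql.PreparedStatement;
import com.datastax.oss.driver.api.core.cql.Row;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

public final class RowCopy {

    // Illustrative helper: copies one Row into another table, assuming the
    // target table has the same column names and compatible types.
    public static void copyRow(CqlSession session, Row row, String keyspace, String table) {
        List<String> names = new ArrayList<>();
        List<Object> values = new ArrayList<>();
        for (ColumnDefinition def : row.getColumnDefinitions()) {
            String name = def.getName().asInternal();
            names.add(name);
            values.add(row.getObject(name));
        }
        String cql = "INSERT INTO " + keyspace + "." + table
                + " (" + String.join(", ", names) + ") VALUES ("
                + names.stream().map(n -> "?").collect(Collectors.joining(", ")) + ")";
        PreparedStatement ps = session.prepare(cql);
        session.execute(ps.bind(values.toArray()));
    }

    public static void main(String[] args) {
        try (CqlSession session = CqlSession.builder().build()) {
            // Fetch any row; the copy does not care which columns the table has
            Row source = session.execute("SELECT * FROM mykeyspace.sourcetable LIMIT 1").one();
            copyRow(session, source, "mykeyspace", "mytable");
        }
    }
}

Because the column names come from the row's own metadata and the values are bound positionally through a prepared statement, the driver's codecs handle the types and nothing table-specific is hard-coded.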