I am using spark-sql 2.4.1 and spark-cassandra-connector 2.4.1 with Java. To write a DataFrame into a Cassandra database, I create a SparkConf:
SparkConf conf = new SparkConf(true)
        .set("spark.cassandra.connection.host", cassandraConfig.getHosts())
        .set( ... );
from which I create a SparkSession as below:
spark = SparkSession
        .builder()
        .appName(appName)
        .config("spark.master", deploymentMaster)
        .config(conf)
        .getOrCreate();
Using the same session, I read data from a Cassandra table.
Instead of a fixed SparkConf, I would like to set a few more Cassandra properties dynamically and then create the SparkSession, which I will use to read data from a Cassandra table. How can this be done?
There are ways to set configuration on an existing sqlContext or sparkContext.
To add a config to the existing sparkContext:
ss.sparkContext.getConf.set("key","value")
To add a config to the existing sqlContext:
ss.sqlContext.setConf("key","value")
To get the existing sparkConf:
ss.sparkContext.getConf()
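Putting this together in the question's Java setting, one way to apply properties that are only known at runtime is to fold them into the SparkConf before calling getOrCreate(). This is a minimal sketch, not the connector's own API: buildSession and dynamicProps are hypothetical names, and the appName/deploymentMaster parameters mirror the question's own variables.

```java
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public class DynamicCassandraConf {

    // Hypothetical helper: merges Cassandra properties resolved at runtime
    // into the base SparkConf before the session is created.
    public static SparkSession buildSession(SparkConf baseConf,
                                            Map<String, String> dynamicProps,
                                            String appName,
                                            String deploymentMaster) {
        // Apply each dynamically resolved property (for example, different
        // contact points or auth settings) on top of the fixed configuration.
        for (Map.Entry<String, String> e : dynamicProps.entrySet()) {
            baseConf.set(e.getKey(), e.getValue());
        }
        return SparkSession
                .builder()
                .appName(appName)
                .config("spark.master", deploymentMaster)
                .config(baseConf)
                .getOrCreate();
    }
}
```

Once the session exists, the Java equivalent of the snippets above is spark.sqlContext().setConf("key", "value") for per-query settings, while spark.sparkContext().getConf() returns the conf the context was created with.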
Config can also be set at submit time via spark-submit:
spark-submit --conf spark.cassandra.connection.host=
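For example, a full invocation might look like the following; the host and port values, the class name, and the jar are placeholders, not values from the question:

```shell
# Pass Cassandra connection settings at submit time; each --conf takes one
# key=value pair. Values shown here are placeholders.
spark-submit \
  --conf spark.cassandra.connection.host=10.0.0.1 \
  --conf spark.cassandra.connection.port=9042 \
  --class com.example.MyApp \
  my-app.jar
```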