
How to get HiveContext from JavaSparkContext


In some Spark code, I have seen programmers use something like the following to create a SparkSession:

 import org.apache.spark.sql.SparkSession;

 // warehouseLocation is a path to the Hive warehouse directory
 SparkSession session = SparkSession
      .builder()
      .appName("Spark Hive Example")
      .config("spark.sql.warehouse.dir", warehouseLocation)
      .enableHiveSupport()
      .getOrCreate();

But I have always used this kind of code to create a JavaSparkContext:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf sparkConf = new SparkConf().setAppName("Simple App").setMaster("local");
JavaSparkContext spark = new JavaSparkContext(sparkConf);

Starting from the latter code, is there any way to get a HiveContext so I can perform operations on Hive tables?

Thanks!


Solution

  • Finally found the solution.

    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.SparkSession;

    // In Spark 2.x, a SparkSession built with enableHiveSupport()
    // replaces the old HiveContext.
    SparkSession spark = SparkSession
                        .builder()
                        .appName("SampleApp")
                        .master("local")
                        .enableHiveSupport()
                        .getOrCreate();

    // Wrap the session's underlying SparkContext to get a JavaSparkContext.
    JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());
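
    With Hive support enabled, the session itself plays the role the old HiveContext used to: Hive queries go through spark.sql(), while the wrapped JavaSparkContext remains available for RDD work. A minimal sketch, assuming a Hive table named my_table already exists (the table name is just an illustration):

    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;

    // Query a Hive table through the Hive-enabled session
    // (my_table is a placeholder; substitute your own table).
    Dataset<Row> result = spark.sql("SELECT * FROM my_table LIMIT 10");
    result.show();

    // The wrapped JavaSparkContext still works for plain RDD operations.
    JavaRDD<Row> rows = result.javaRDD();
    System.out.println("Row count: " + rows.count());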