java, scala, apache-spark, apache-spark-sql, apache-spark-dataset

Cast a Spark DataFrame to a target schema all at once


I have a DataFrame all of whose columns are of String type, and I have a schema that contains the wanted type for each column. Is there any way to wrap the conversion in one big try/catch clause and convert the whole schema dynamically at once? The only solution I've seen is to handle each column individually and cast its type.
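For illustration, the per-column approach looks something like this (the column names and types here are hypothetical, not from my actual schema):

    import org.apache.spark.sql.functions.col
    import org.apache.spark.sql.types.{DoubleType, IntegerType}

    // One withColumn call per column -- tedious for wide schemas.
    val converted = df
      .withColumn("id", col("id").cast(IntegerType))
      .withColumn("score", col("score").cast(DoubleType))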


Solution

  • Try:

    val newDf = sparkSession.createDataFrame(oldDf.rdd, schema)
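    Note that createDataFrame reassigns the schema without converting the underlying row values, so if the rows still hold Strings where the schema expects other types, the mismatch surfaces when an action runs. To actually convert the values, one alternative is a single select that casts every column listed in the target schema. A minimal sketch, assuming the schema's field names match the DataFrame's columns (the example data and columns are assumptions, not from the question):

        import org.apache.spark.sql.SparkSession
        import org.apache.spark.sql.functions.col
        import org.apache.spark.sql.types._

        val spark = SparkSession.builder()
          .master("local[*]")
          .appName("cast-schema")
          .getOrCreate()
        import spark.implicits._

        // All-String DataFrame, as in the question.
        val oldDf = Seq(("1", "2.5", "true")).toDF("id", "score", "active")

        // Target schema with the wanted type for each column.
        val schema = StructType(Seq(
          StructField("id", IntegerType),
          StructField("score", DoubleType),
          StructField("active", BooleanType)
        ))

        // Cast every column to its target type in one pass over the schema fields.
        val newDf = oldDf.select(schema.fields.map(f => col(f.name).cast(f.dataType)): _*)
        newDf.printSchema()
        newDf.show()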