Tags: scala, dataframe, apache-spark, apache-spark-sql, case-class

How to create a schema in Spark with Scala when the input has more than 100 columns?


With a case class we run into restrictions (for example, the 22-field limit in older Scala versions). Is it possible to use StructType for 100+ columns? Is there any other way to create a schema for around 600 columns?


Solution

StructType has no such field limit, so the 600 StructFields can be generated programmatically instead of being written out by hand:

    import org.apache.spark.sql.Row
    import org.apache.spark.sql.types.{StringType, StructField, StructType}

    // Generate 600 string fields named Column_1 .. Column_600 and wrap them in a schema
    val columns = (1 to 600).map(i => StructField(s"Column_$i", StringType))
    val schemaWithSixHundredColumns = StructType(columns)

    // Create an empty DataFrame with that schema
    val df = spark.createDataFrame(new java.util.ArrayList[Row](), schemaWithSixHundredColumns)
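As a usage sketch beyond the original answer: the same generated schema can be passed to a DataFrameReader, which skips schema inference over all 600 columns. The file path and the header option here are assumptions for illustration, not part of the original solution:

    // Hypothetical input: a headerless CSV whose 600 columns match the generated schema
    val wideDf = spark.read
      .schema(schemaWithSixHundredColumns) // reuse the programmatically built schema
      .option("header", "false")
      .csv("/path/to/data.csv")

The same pattern also scales past 600: since the field names come from a Scala expression, they could just as easily be read from a header line or a metadata file rather than generated as Column_$i.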