Tags: scala, apache-spark, dataframe, spark-shell

Scala, Spark-shell, Groupby failing


I have Spark version 2.4.0 and Scala version 2.11.12. I can successfully load a DataFrame with the following code:

val df = spark.read.format("csv").option("header","true").option("delimiter","|").option("mode","DROPMALFORMED").option("maxColumns",60000).load("MAR18.csv")

However, when I attempt the following group-by, I get an error:

df.groupby("S0102_gender").agg(sum("Respondent.Serial")).show()

The error message is:

error: value groupby is not a member of org.apache.spark.sql.DataFrame

What am I missing? I'm a complete Scala and Spark newbie.


Solution

  • Instead of groupby it should be groupBy, as below. Scala method names are case-sensitive, so this is simply a typo:

    df.groupBy("S0102_gender").agg(sum("Respondent.Serial")).show()
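Two further points may trip up this line in a fresh spark-shell session (a hedged sketch, assuming the same DataFrame and column names as above): `sum` lives in `org.apache.spark.sql.functions` and needs an import, and because the column name `Respondent.Serial` contains a dot, Spark may try to resolve it as a struct field access; wrapping the name in backticks makes Spark treat it as a single column name.

```scala
// sum is not in scope by default in spark-shell
import org.apache.spark.sql.functions.sum

// Backticks escape the dot, so "Respondent.Serial" is read as one
// column name instead of field "Serial" inside a struct "Respondent".
df.groupBy("S0102_gender")
  .agg(sum("`Respondent.Serial`"))
  .show()
```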