java, apache-spark, apache-spark-sql, apache-spark-ml

Spark Dataset<Row> Vector column to Array type conversion


I have a column "features" that contains a Vector. Is there a way to convert this Vector column to an Array column? I am using Spark 2.3 with Java. The final objective is to split the Vector into individual columns. Thank you.


Solution

  • This can be done with UserDefinedFunction. You can define one like this:

    import org.apache.spark.ml.linalg.Vector;
    import org.apache.spark.sql.api.java.UDF1;
    import org.apache.spark.sql.expressions.UserDefinedFunction;
    import org.apache.spark.sql.types.*;
    import static org.apache.spark.sql.functions.*;
    
    // Cast the lambda to UDF1 so Java resolves the right udf(...) overload
    UserDefinedFunction toArray = udf(
      (UDF1<Vector, double[]>) (Vector v) -> v.toArray(),
      new ArrayType(DataTypes.DoubleType, false)
    );
    

    and then apply it to a Column:

    import org.apache.spark.sql.Column;
    
    Column featuresArray = toArray.apply(col("features"));
    

    where the result can be used with select or withColumn.
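
    For example, a minimal sketch (assuming a Dataset<Row> named df that contains the "features" column; df is not in the original question):

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    
    // Append the array representation next to the original Vector column
    Dataset<Row> withArray = df.withColumn("features_array", featuresArray);
    
    // Or keep only the array representation
    Dataset<Row> onlyArray = df.select(featuresArray.alias("features_array"));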

    the final objective is to split the Vector into individual columns.

    That's just a matter of simple indexing once you have the Array column - see Spark Scala: How to convert Dataframe[vector] to DataFrame[f1:Double, ..., fn: Double)]
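
    In Java, a minimal sketch of that indexing (assuming three-element vectors, and the hypothetical df and featuresArray from above):

    // getItem(i) extracts element i of the array column;
    // vector length is assumed to be 3 here
    Dataset<Row> split = df
      .withColumn("features_array", featuresArray)
      .withColumn("f1", col("features_array").getItem(0))
      .withColumn("f2", col("features_array").getItem(1))
      .withColumn("f3", col("features_array").getItem(2));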