I am trying to get the array of scores out of a TF-IDF result vector. For example:
rescaledData.select("words", "features").show(truncate=False)
+-----------------------------+---------------------------------------------------------------------------------------------+
|words |features |
+-----------------------------+---------------------------------------------------------------------------------------------+
|[a, b, c] |(4527,[0,1,31],[0.6363067860791387,1.0888040725098247,4.371858972705023]) |
|[d] |(4527,[8],[2.729945780576634]) |
+-----------------------------+---------------------------------------------------------------------------------------------+
rescaledData.select(rescaledData['features'].getItem('values')).show()
But instead of an array, I got an error:
AnalysisException: u"Can't extract value from features#1786: need struct type but got struct<type:tinyint,size:int,indices:array<int>,values:array<double>>;"
What I want is:
+--------------------------+-----------------------------------------------------------+
|words |features |
+--------------------------+-----------------------------------------------------------+
|[a, b, c] |[0.6363067860791387, 1.0888040725098247, 4.371858972705023]|
+--------------------------+-----------------------------------------------------------+
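For reference, something like this does read the scores on the driver (just a sanity check, since the column holds SparseVector objects), but I need them as a DataFrame column, not collected rows:
row = rescaledData.select("features").first()
print(row.features.values)   # numpy array of the stored TF-IDF scores
print(row.features.indices)  # the corresponding term indices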
How can I fix this?
Another option is to create a udf that extracts the values from the sparse vector:
from pyspark.sql.functions import udf
from pyspark.sql.types import DoubleType, ArrayType

# SparseVector.values is a numpy array of the stored (non-zero) scores;
# .tolist() turns it into a plain Python list matching ArrayType(DoubleType()).
sparse_values = udf(lambda v: v.values.tolist(), ArrayType(DoubleType()))
df.withColumn("features", sparse_values("features")).show(truncate=False)
+---------+-----------------------------------------------------------+
|words    |features                                                   |
+---------+-----------------------------------------------------------+
|[a, b, c]|[0.6363067860791387, 1.0888040725098247, 4.371858972705023]|
|[d] |[2.729945780576634] |
+---------+-----------------------------------------------------------+
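If you are on a newer Spark release, a UDF-free variant is also possible. This is only a sketch, assuming Spark 3.0+ for vector_to_array and Spark 3.1+ for F.filter; note that vector_to_array returns the full dense array, zeros included, so the zeros have to be filtered back out to keep only the stored scores:
from pyspark.ml.functions import vector_to_array
from pyspark.sql import functions as F

# Convert the vector column to a dense array (length 4527 here, zeros included),
# then drop the zeros to keep only the stored TF-IDF scores.
# Caveat: a term whose IDF is exactly 0 would be dropped as well.
dense = df.withColumn("features", vector_to_array(F.col("features")))
dense.withColumn("features", F.filter(F.col("features"), lambda x: x != 0.0)) \
     .show(truncate=False)
For a large vocabulary the UDF above may still be preferable, since it never materializes the dense array.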