Tags: scala, apache-spark, apache-spark-sql, spark-structured-streaming

How to do a higher-order function transform with a subquery and a map lookup?


This is a follow-up to my previous question.

scala> val map1 = spark.sql("select map('s1', 'p1', 's2', 'p2', 's3', 'p3') as lookup")

map1: org.apache.spark.sql.DataFrame = [lookup: map<string,string>]

scala> val ds1 = spark.sql("select 'p1' as p, Array('s2','s3') as c")

ds1: org.apache.spark.sql.DataFrame = [p: string, c: array<string>]

scala>  ds1.createOrReplaceTempView("ds1")

scala> map1.createOrReplaceTempView("map1")

scala> map1.show()
+--------------------+
|              lookup|
+--------------------+
|[s1 -> p1, s2 -> ...|
+--------------------+


scala> ds1.show()
+---+--------+
|  p|       c|
+---+--------+
| p1|[s2, s3]|
+---+--------+

scala> map1.selectExpr("element_at(`lookup`, 's2')").first()

res50: org.apache.spark.sql.Row = [p2]

scala> spark.sql("select element_at(`lookup`, 's1') from map1").show()
+----------------------+
|element_at(lookup, s1)|
+----------------------+
|                    p1|
+----------------------+
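
For reference, the same lookup can also be written with the DataFrame API; a minimal sketch using element_at from org.apache.spark.sql.functions (available since Spark 2.4):

    import org.apache.spark.sql.functions.{col, element_at}

    // look up the value stored under key 's2' in the `lookup` map column
    map1.select(element_at(col("lookup"), "s2")).first()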

So far so good. In the next two steps, however, I hit issues:

scala> ds1.selectExpr("p", "c", "transform(c, cs -> map1.selectExpr('element_at(`lookup`, cs)')) as cs").show()

20/09/28 19:44:59 WARN HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
20/09/28 19:44:59 WARN HiveConf: HiveConf of name hive.stats.retries.wait does not exist
20/09/28 19:45:03 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 2.3.0
20/09/28 19:45:03 WARN ObjectStore: setMetaStoreSchemaVersion called but recording version is disabled: version = 2.3.0, comment = Set by MetaStore …
20/09/28 19:45:03 WARN ObjectStore: Failed to get database map1, returning NoSuchObjectException
org.apache.spark.sql.AnalysisException: Undefined function: 'selectExpr'. This function is neither a registered temporary function nor a permanent function registered in the database 'map1'.; line 1 pos 19

scala> spark.sql("""select p, c, transform(c, cs -> (select element_at(`lookup`, cs) from map1)) cc from ds1""").show()

org.apache.spark.sql.AnalysisException: cannot resolve 'cs' given input columns: [map1.lookup]; line 1 pos 61;
'Project [p#329, c#330, transform(c#330, lambdafunction(scalar-subquery#713 [], lambda cs#715, false)) AS cc#714]
:  +- 'Project [unresolvedalias('element_at(lookup#327, 'cs), None)]
:     +- SubqueryAlias map1
:        +- Project [map(s1, p1, s2, p2, s3, p3) AS lookup#327]
:           +- OneRowRelation
+- SubqueryAlias ds1
   +- Project [p1 AS p#329, array(s2, s3) AS c#330]
      +- OneRowRelation

How can I solve these issues?


Solution

  • You can't call a DataFrame method such as selectExpr from inside SQL, and a transform lambda variable such as cs is not visible inside a scalar subquery, which is why both attempts fail. Instead, bring lookup into scope by adding map1 to the from clause (a cross join against the single-row map1):

    spark.sql("""select p, c, transform(c, cs -> element_at(`lookup`, cs)) cc from ds1 a, map1 b""").show()
    
    +---+--------+--------+
    |  p|       c|      cc|
    +---+--------+--------+
    | p1|[s2, s3]|[p2, p3]|
    +---+--------+--------+
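
    If you prefer the DataFrame API over SQL, the same cross join plus per-element lookup can be sketched with the transform function from org.apache.spark.sql.functions (available since Spark 3.0; on earlier versions, stick with the SQL above):

    import org.apache.spark.sql.functions.{col, element_at, transform}

    // cross join the single-row map1 so `lookup` is in scope,
    // then map every element of `c` through the lookup map
    ds1.crossJoin(map1)
      .select(col("p"), col("c"),
        transform(col("c"), cs => element_at(col("lookup"), cs)).as("cc"))
      .show()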